Patent abstract:
An agricultural monitoring system, the agricultural monitoring system comprising: an image forming sensor, configured and operable to acquire image data in submillimeter image resolution of parts of an agricultural area in which crops grow, when the image forming sensor is airborne; a communication module, configured and operable to transmit to an external system image data content that is based on the image data acquired by the aerial image forming sensor; and an operable connector for connecting the image forming sensor and communication module to an aerial platform.
Publication number: BR112017014855B1
Application number: R112017014855-2
Filing date: 2015-12-02
Publication date: 2021-06-08
Inventor: Amihay Gornik
Applicant: A.A.A. Taranis Visual Ltd.
IPC main classification:
Patent description:

Technical Field
[001] The invention relates to systems, methods, and computer program products for agricultural monitoring, and more specifically to systems, methods, and computer program products for agricultural monitoring that are based on image data acquired by an aerial image forming sensor.
Prior Art
[002] Chinese utility model serial number CN203528823, titled "Rice bacterial leaf blight preventing unmanned aerial vehicle with colored rice disease image identifier", relates to an unmanned aerial vehicle for preventing rice bacterial leaf blight with a rice disease color image identifier, and belongs to the technical field of plant protection by agricultural aviation. The rice bacterial leaf blight prevention smart unmanned aerial vehicle flies over a rice field and detects the occurrence of bacterial leaf blight: a camera and a video camera mounted in a photoelectric capsule below the intelligent unmanned aerial vehicle feed rice disease color images detected in the rice field to a rice disease color image storage system for storage; the images are then fed to the rice disease color image identifier and compared with standard rice disease color images; disease types and harmful situations are identified and confirmed; bacterial leaf blight damage information is fed to a computerized spray treatment instruction information system for processing; a spray treatment instruction is produced; a pressure pump applies pressure to a chemical pesticide treatment liquid according to the spray treatment instruction; and the pressurized chemical pesticide liquid is sprayed onto the rice field by a bacterial leaf blight treatment chemical pesticide liquid sprayer.
[003] Chinese utility model serial number CN20353822, titled "Rice sheath blight disease preventing unmanned aerial vehicle with colored rice disease image identifier", relates to an unmanned aerial vehicle for prevention of rice sheath blight with a rice disease color image identifier, and belongs to the technical field of plant protection by agricultural aviation. A video camera and a camera mounted in a photoelectric capsule below the rice sheath blight prevention smart unmanned aerial vehicle feed color images of rice disease in a rice field to a rice disease color image storage system for storage; the images are then fed to the rice disease color image identifier and compared with stored standard rice disease color images; rice sheath blight harmful situations are identified; harmful information on the sheath blight is fed to a computerized spray treatment instruction information system for processing; a spray treatment instruction transmitted by the computerized spray treatment instruction information system regulates, through a spray treatment information transmission line, the pressure applied by a pressure pump to a chemical pesticide treatment liquid; and the pressurized chemical pesticide liquid is sprayed onto the rice field by a sheath blight treatment chemical pesticide liquid sprayer.
[004] Chinese patent application serial number CN103523226A, entitled "Unmanned aerial vehicle with colorized rice disease image recognition and preventing and treating rice sheath blight diseases", relates to an unmanned aerial vehicle with a rice disease color image recognition instrument for prevention and treatment of rice sheath blight, and belongs to the technical field of plant protection by agricultural aviation. Color images of rice disease in rice fields, detected by a vidicon and cameras in a photoelectric capsule below the intelligent unmanned aerial vehicle for preventing and treating rice sheath blight, are fed to a rice disease color image storage system for storage, and are then fed to the rice disease color image recognition instrument to be compared with standard rice disease color images to recognize rice sheath blight risk situations. Harmful information on rice sheath blight is fed to a computerized spray treatment command information system. Spray treatment commands sent by the computerized spray treatment command information system control, through a treatment command information transmission line, the pressure of a chemical pesticide treatment liquid pressure pump, and control the pressurized chemical pesticide liquid to be sprayed into the rice fields through a sprayer containing the chemical pesticide liquid to treat rice sheath blight.
[005] Japanese patent application serial number JPH11235124A, entitled "Precise farming", discusses a precision farming method capable of preventing over- or under-application of fertilizers and pesticides, improving the application efficiency of fertilizers and pesticides, and increasing crop yield by detecting the crop growth status of a farm field to automatically form a crop growth map of the farm field and subsequently apply fertilizers, pesticides, etc., based on the data of the crop growth map so formed. The patent application discusses a precision farming method comprising aerially photographing the crop growth state of a farm field, for example with a camera 70 mounted in a helicopter, detecting the chlorophyll contents of the crops from the images taken with the color sensor camera 70 to detect the growth status of the farm field crop, and subsequently forming the farm field crop growth map.
[006] United States patent application serial number US11/353,351, entitled "Irrigation remote sensing system", discusses a data collection device associated with an agricultural irrigation system, including at least one mobile camera connected to the irrigation system.
Summary of the Invention
[007] According to one aspect of the invention, a method for agricultural monitoring is disclosed, the method including: (a) flying an aerial image forming sensor along a flight path over an agricultural area in which crops grow; (b) acquiring, by the aerial image forming sensor, image data of parts of the agricultural area, where the acquisition of the image data is performed at a set of imaging sites along the flight path that allows the image data to be acquired at submillimeter image resolution; and (c) transmitting to an external system image data content that is based on the image data acquired by the aerial image forming sensor.
[008] According to a further aspect of the invention, the method may include transmitting the image data content to the external system for display, to an agronomist at a remote location, of agronomic image data that is based on the image data content, thus allowing the agronomist to remotely analyze the agricultural area.
[009] According to a further aspect of the invention, the flight path is a terrain-following flight path.
[0010] According to a further aspect of the invention, the acquisition includes acquiring image data at the set of imaging sites while flying the aerial image forming sensor over the imaging sites at speeds that do not fall below 50% of the average speed of the aerial platform along the flight path.
[0011] According to a further aspect of the invention, the acquisition includes mechanically moving at least one component of the aerial image forming sensor with respect to an aerial transport platform, to compensate for the movement of the aerial image forming sensor with respect to the crops during acquisition.
[0012] According to a further aspect of the invention, the acquisition includes: (a) mechanically rotating at least one optical component of the aerial image forming sensor with respect to an aerial transport platform to compensate for movement of the aerial image forming sensor with respect to the crops during acquisition; and (b) concurrently with the rotation of the at least one optical component, for each frame of a plurality of frames of image data: initiating a focusing process of the image forming sensor when an acquisition optical axis is at an angle greater than 20° from the vertical axis, and acquiring the image data using vertical imaging when the acquisition optical axis is at an angle less than 20° from the vertical axis.
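The angle-based sequencing of this aspect can be illustrated with a short sketch. This is a simplified illustration only, not the patented implementation: the `gimbal` and `sensor` objects and the `off_vertical_angle_deg()` helper are hypothetical stand-ins for the rotating optical component and the image forming sensor.

```python
OFF_VERTICAL_THRESHOLD_DEG = 20.0  # threshold named in this aspect

def acquire_frame(sensor, gimbal):
    """Start focusing while the optical axis is still far from vertical,
    then expose the frame once the axis passes within 20 degrees of vertical."""
    focusing_started = False
    while True:
        angle = gimbal.off_vertical_angle_deg()  # angle between acquisition optical axis and vertical
        if not focusing_started and angle > OFF_VERTICAL_THRESHOLD_DEG:
            sensor.start_focus()       # focusing is initiated early, during the rotation
            focusing_started = True
        elif focusing_started and angle < OFF_VERTICAL_THRESHOLD_DEG:
            return sensor.expose()     # vertical (near-nadir) imaging of the frame
```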
[0013] According to a further aspect of the invention, acquisition includes illuminating crops during acquisition to compensate for movement of the aerial imager sensor with respect to crops during acquisition.
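One plausible reading of how illumination compensates for sensor motion (offered here for clarity; the mechanism is not spelled out in this aspect) is that brighter illumination permits a shorter exposure time, so that the image smear caused by the platform's motion stays below the pixel footprint. With assumed example values of 15 m/s ground speed and a 0.5 mm smear budget:

$$ s = v \, t_{\mathrm{exp}} \le 0.5\ \mathrm{mm} \;\Rightarrow\; t_{\mathrm{exp}} \le \frac{0.5\ \mathrm{mm}}{15\ \mathrm{m/s}} \approx 33\ \mu\mathrm{s}. $$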
[0014] According to a further aspect of the invention, the flight includes flying the aerial image forming sensor along a flight path extending over at least a first farm of a first owner and a second farm of a second owner other than the first owner, the method including: acquiring first image data of parts of the first farm and acquiring second image data of parts of the second farm; generating first image data content based on the first image data and generating second image data content based on the second image data; and providing the first image data content to a first entity in a first message, and providing the second image data content to a second entity in a second message.
[0015] According to a further aspect of the invention, the acquisition includes acquiring image data of parts of the agricultural area that are inaccessible to land vehicles.
[0016] According to a further aspect of the invention, the acquisition includes acquiring image data of parts of the agricultural area that are inaccessible on foot.
[0017] According to a further aspect of the invention, flight includes flying the image forming sensor by an agricultural aircraft that is configured for aerial application of crop protection products.
[0018] According to a further aspect of the invention, the method further includes selecting aerial application parameters for aerial application of crop protection products by agricultural aircraft based on processing the image data.
[0019] According to a further aspect of the invention, the set of imaging sites along the aerial trajectory is located less than 20 meters above the top of plantations growing in the agricultural area.
[0020] According to a further aspect of the invention, the acquisition includes acquiring image data of the agricultural area at a coverage rate below 500 square meters per hectare.
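As a worked illustration of this coverage rate (following directly from the stated bound, with one hectare taken as 10,000 square meters), imaging at most 500 square meters per hectare corresponds to sampling at most 5% of the agricultural area:

$$ \frac{500\ \mathrm{m}^2}{10{,}000\ \mathrm{m}^2} = 0.05 = 5\%. $$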
[0021] According to a further aspect of the invention, the transmission is followed by a subsequent instance of the flight, acquisition and transmission, the method further including planning a flight path for the subsequent instance of the flight, based on the image data acquired in a previous instance of the acquisition.
[0022] According to a further aspect of the invention, the acquisition includes compensating the movement of the image forming sensor during the acquisition of the image data.
[0023] According to a further aspect of the invention, acquiring the image data includes acquiring the image data using vertical imaging.
[0024] According to a further aspect of the invention, the method further includes applying computerized processing algorithms to image data content to detect leaf diseases or indication of leaf parasite effect on one or more plants in the agricultural area.
[0025] According to a further aspect of the invention, the flight, acquisition and transmission are repeated over multiple weeks, the method further including processing image data acquired at different times during the multiple weeks to determine growth parameters for plants in the agricultural area.
[0026] According to a further aspect of the invention, the method further includes applying computerized processing algorithms to the image data to identify agronomically significant data, and generating agronomic image data for transmission to a remote system based on the selected agronomically significant data.
[0027] According to a further aspect of the invention, the method further includes applying computerized processing algorithms to the selected agronomically significant data to select, from a plurality of possible recipients, a recipient for the agronomic image data, based on the agronomic expertise of the possible recipients.
[0028] According to a further aspect of the invention, the flight is preceded by defining a surveillance flight plan for an aerial surveillance system, the surveillance flight plan including an acquisition site plan indicative of a plurality of imaging sites, wherein the flight of the aerial image forming sensor is part of flying the aerial surveillance system along the flight path over the agricultural area, based on the surveillance flight plan.
[0029] According to a further aspect of the invention, the flight path is a terrain-following flight path; the flight includes flying the image forming sensor by an agricultural aircraft that is configured for aerial application of crop protection products; the set of imaging sites along the flight path is located less than 20 meters above the top of plantations growing in the agricultural area; the acquisition includes: (a) acquiring image data at the set of imaging sites while flying the aerial imager sensor across the imaging sites at speeds that do not fall below 50% of the average speed of the aerial platform along the flight path; and (b) compensating for movement of the aerial imager sensor with respect to the crops during acquisition by illuminating the crops during acquisition and by mechanically moving at least one component of the aerial imager sensor with respect to an aerial transport platform; the transmission includes transmitting the image data content to the external system for display, to an agronomist at a remote location, of agronomic image data that is based on the image data content, thereby enabling the agronomist to remotely analyze the agricultural area; and the method further includes, prior to the flight, defining a surveillance flight plan for an aerial surveillance system, the surveillance flight plan including an acquisition site plan indicative of a plurality of imaging sites, wherein the flight of the aerial imager sensor is part of flying the aerial surveillance system along the flight path over the agricultural area, based on the surveillance flight plan.
[0030] According to an aspect of the invention, a method for agricultural monitoring is disclosed, the method including: (a) defining a surveillance flight plan for an aerial surveillance system, the surveillance flight plan including an acquisition site plan indicative of a plurality of imaging sites; (b) based on the surveillance flight plan, flying the aerial surveillance system along a flight path over an agricultural area in which crops grow; (c) based on the acquisition site plan, acquiring, in flight, image data of parts of the agricultural area at submillimeter image resolution by the aerial surveillance system; and (d) transmitting to an external system image data content that is based on the image data acquired by the aerial surveillance system.
[0031] According to a further aspect of the invention, the definition of the surveillance flight plan is preceded by receiving surveillance requests associated with a plurality of independent entities, and includes defining the surveillance flight plan to indicate imaging sites for plantations of each of the plurality of independent entities.
[0032] According to a further aspect of the invention, the agricultural area includes a plurality of fields in which at least two types of plantations grow, the definition of the surveillance flight plan including defining different acquisition parameters for imaging sites associated with different types of plantations.
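A surveillance flight plan with plantation-dependent acquisition parameters could be represented, for instance, as a simple mapping from crop type to parameters; the field names and numeric values below are illustrative assumptions, not values taken from this disclosure.

```python
from dataclasses import dataclass

@dataclass
class AcquisitionParams:
    altitude_m: float     # height above the crop canopy at the imaging sites
    exposure_us: float    # exposure time, in microseconds
    target_gsd_mm: float  # desired ground sampling distance, in millimeters

# Illustrative per-crop acquisition parameters for the imaging sites of the plan.
PARAMS_BY_CROP = {
    "wheat":   AcquisitionParams(altitude_m=10.0, exposure_us=50.0, target_gsd_mm=0.5),
    "orchard": AcquisitionParams(altitude_m=20.0, exposure_us=80.0, target_gsd_mm=0.75),
}
```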
[0033] According to one aspect of the invention, an agricultural monitoring system is disclosed, the agricultural monitoring system including: (a) an image forming sensor, configured and operable to acquire image data at submillimeter image resolution of parts of an agricultural area in which crops grow, when the image forming sensor is airborne; (b) a communication module, configured and operable to transmit to an external system image data content that is based on the image data acquired by the aerial image forming sensor; and (c) a connector operable to connect the image forming sensor and the communication module to an aerial platform.
[0034] According to a further aspect of the invention, the agricultural monitoring system includes an airborne aerial platform that is operable to fly the aerial image forming sensor along a flight path over an agricultural area.
[0035] According to a further aspect of the invention, the agricultural monitoring system additionally includes a detachable coupling, operable to detachably couple the aerial imager sensor to an aerial platform.
[0036] According to a further aspect of the invention, the image forming sensor is configured and operable to acquire the image data at an altitude less than 20 meters above the top of plantations growing in the agricultural area.
[0037] According to a further aspect of the invention, the image forming sensor is configured and operable to acquire the image data while flying at speeds exceeding 10 m/s.
[0038] According to a further aspect of the invention, the agricultural monitoring system further includes at least one mechanical coupling that couples at least one component of the image forming sensor to a motor, whereby the movement of the motor mechanically moves the at least one image forming sensor component with respect to the aerial platform concurrently with the acquisition of image data by the image forming sensor.
[0039] According to a further aspect of the invention, the agricultural monitoring system further includes a motor operable to mechanically rotate at least one optical component of the image forming sensor with respect to the aerial platform, to compensate for the movement of the image forming sensor with respect to the plantations during acquisition; whereby the image forming sensor is configured and operable to: (a) initiate a focusing process concurrently with the rotation of the at least one optical component when an acquisition optical axis is at an angle greater than 20° from the vertical axis, and (b) acquire the image data using vertical imaging when the acquisition optical axis is at an angle less than 20° from the vertical axis.
[0040] According to a further aspect of the invention, the agricultural monitoring system includes a lighting unit, configured and operable to illuminate the crops during image data acquisition by the image forming sensor.
[0041] According to a further aspect of the invention, the image forming sensor is configured and operable to acquire the image data using vertical imaging.
[0042] According to a further aspect of the invention, the agricultural monitoring system further includes a processor that is configured and operable to process image data content to detect leaf diseases or indication of leaf parasite effect on one or more plants in the agricultural area.
[0043] According to a further aspect of the invention, the agricultural monitoring system includes a processor that is configured and operable to process image data content to identify selected agronomically significant data, and to generate agronomic image data for transmission to a remote system based on the selected agronomically significant data.
[0044] According to one aspect of the invention, a method for agricultural monitoring is disclosed, the method including: (a) receiving image data content that is based on image data of an agricultural area, the image data being submillimeter image resolution image data acquired by an aerial imaging sensor at a set of imaging sites along a flight path extending over the agricultural area; (b) processing the image data content to generate agronomic data that includes agronomic image data; and (c) transmitting the agronomic data to a remote end-user system.
[0045] According to a further aspect of the invention, the processing includes analyzing image data content to identify selected agronomically significant data within the image data content; and processing the significant agronomic data to provide the agronomic data.
[0046] According to a further aspect of the invention, the processing includes applying computerized processing algorithms to image data content to detect leaf diseases or indication of leaf parasitic effects on one or more plants in the agricultural area.
[0047] According to a further aspect of the invention, the reception includes receiving image data content of the agricultural area acquired on different days, the processing including processing the image data content to determine growth parameters for plants in the agricultural area.
[0048] According to a further aspect of the invention, the method further includes applying computerized processing algorithms to the agronomic data to select, from a plurality of possible recipients, a recipient for the agronomic image data, based on the agronomic expertise of the possible recipients.
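A minimal sketch of such recipient selection, under the assumption that each candidate recipient is tagged with the crop types in which they have expertise (the data model and field names are hypothetical):

```python
def select_recipient(crop_type, candidates):
    """Return the first candidate whose declared expertise covers the crop type.

    `candidates` is assumed to be a list of dicts such as
    {"name": "...", "expertise": {"wheat", "quinoa"}}.
    """
    for candidate in candidates:
        if crop_type in candidate["expertise"]:
            return candidate
    return None  # no matching expert; the caller may fall back to a default recipient
```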
[0049] According to a further aspect of the invention, the image data content includes first image data content of a first farm property of a first owner, and second image data content of a second farm property of a second owner other than the first owner; wherein the transmission includes transmitting the first image data content in a first message, and transmitting the second image data content in a second message.
[0050] According to a further aspect of the invention, the image data content is based on image data acquired at a set of imaging sites along the flight path that are located less than 20 meters above the top of plantations growing in the agricultural area.
Brief Description of the Drawings
[0051] To understand the invention and see how it can be carried out in practice, configurations will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
[0052] Fig. 1A is a functional block diagram illustrating an example of a system in an exemplary environment, in accordance with examples of the subject matter presently disclosed;
[0053] Fig. 1B is a functional block diagram illustrating an example of a system in an exemplary environment, in accordance with examples of the subject matter presently disclosed;
[0054] Fig. 1C is a functional block diagram illustrating an example of a system in an exemplary environment, in accordance with examples of the subject matter presently disclosed;
[0055] Fig. 2 is a flow diagram illustrating an example of a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0056] Fig. 3 is a flow diagram illustrating an example of a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0057] Fig. 4A illustrates a system, an agricultural area, and a flight path, in accordance with examples of the subject matter presently disclosed.
[0058] Fig. 4B illustrates a system, an agricultural area, a flight path, a server, and a plurality of exemplary entities to which significant agronomic data that is based on the image data acquired by the system can be transmitted, in accordance with examples of the subject matter presently disclosed;
[0059] Figs. 5A to 5E illustrate optional stages of a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0060] Fig. 6 is a flow diagram illustrating an example of a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0061] Fig. 7 is a flow diagram illustrating an example of a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0062] Fig. 8 is a flow diagram illustrating an example of a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0063] Fig. 9 is a functional block diagram illustrating an example of an agricultural monitoring system, in accordance with examples of the subject matter presently disclosed;
[0064] Fig. 10 is a functional block diagram illustrating an example of an agricultural monitoring system, in accordance with examples of the subject matter presently disclosed;
[0065] Figs. 11A, 11B, 11C and 11D are functional block diagrams illustrating examples of an agricultural monitoring system with motion compensation mechanisms, in accordance with examples of the subject matter presently disclosed;
[0066] Fig. 12 is a functional block diagram illustrating an example of an agricultural monitoring system, in accordance with examples of the subject matter presently disclosed;
[0067] Fig. 13 illustrates various images acquired by an aerial imaging sensor, in accordance with a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0068] Fig. 14 illustrates the cropping of individual leaves from the image data, in accordance with examples of the subject matter presently disclosed;
[0069] Fig. 15 is a flow diagram illustrating an example of a method for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0070] Fig. 16 is a functional block diagram illustrating an example of a server used for agricultural monitoring, in accordance with examples of the subject matter presently disclosed;
[0071] Fig. 17 is a flow diagram illustrating an example of a method for monitoring ground areas, in accordance with examples of the subject matter presently disclosed;
[0072] Fig. 18 is a flow diagram illustrating an example of a method for monitoring ground area, in accordance with examples of the subject matter presently disclosed;
[0073] Fig. 19 is a functional block diagram illustrating an example of a server used for monitoring a ground area, in accordance with examples of the subject matter presently disclosed.
[0074] It will be appreciated that for simplicity and clarity of illustration, the elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Additionally, where considered appropriate, reference numerals may be repeated between the figures to indicate corresponding or similar elements.
Detailed Description of Embodiments of the Invention
[0075] In the following detailed description, numerous specific details are set forth to provide a complete understanding of the invention. However, it will be understood by those skilled in the art that the present invention can be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail in order not to obscure the present invention.
[0076] In the drawings and descriptions presented, like reference numerals indicate those components that are common to different configurations or embodiments.
[0077] Unless specifically noted otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions using terms such as "processing", "calculation", "computation", "determination", "generation", "definition", "configuration", "selection", or similar, include actions and/or processes of a computer that manipulates and/or transforms data into other data, said data represented as physical quantities, such as electronic quantities, and/or said data representing physical objects. The terms "computer", "processor", and "controller" are to be interpreted expansively to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, a personal computer, a server, a computing system, a computing device, a communication device, a processor (e.g., a digital signal processor (DSP), a microcontroller, a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), etc.), any other electronic computing device, or any combination thereof.
[0078] Operations in accordance with the teachings herein can be performed by a computer specially constructed for the desired purposes, or by a general-purpose computer specially configured for the desired purpose by a computer program stored in a computer-readable storage medium.
[0079] As used herein, the phrases "for example", "such as", "exemplifying" and variants thereof describe non-limiting configurations of the subject matter presently disclosed. Reference in the specification to "one case", "some cases", "other cases" or variants thereof means that a particular aspect, structure or characteristic described in connection with the configuration(s) is included in at least one embodiment of the presently disclosed subject matter. Therefore, the appearance of the phrases "one case", "some cases", "other cases" or variants thereof does not necessarily refer to the same configuration(s).
[0080] It is appreciated that certain features of the presently disclosed subject matter, which are, for clarity, described in the context of separate configurations, may also be provided in combination in a single configuration. Conversely, various features of the presently disclosed subject matter, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination.
[0081] In configurations of the subject matter presently disclosed, one or more stages illustrated in the figures may be performed in a different order and/or one or more groups of stages may be performed simultaneously, and vice versa. The figures illustrate a general scheme of the system architecture in accordance with a configuration of the subject matter presently disclosed. Each module in the figures can be made up of any combination of software, hardware and/or firmware that performs the functions as defined and explained herein. The modules in the figures can be centralized in one location or dispersed over more than one location.
[0082] Fig. 1A is a functional block diagram illustrating an example system 10 in an exemplary environment, in accordance with examples of the presently disclosed subject matter. System 10 is an airborne system, which includes an aerial platform 100 that carries the image forming sensor 210. As discussed below in greater detail, the image forming sensor 210 is flown by the aerial platform 100 over an agricultural area, so as to allow the image forming sensor 210 to acquire image data of the agricultural area. Image data content that is based on the acquired image data is then transferred from the system 10 to a remote location, where it can be analyzed to obtain significant agronomic data.
[0083] Different types of aerial platforms can be used as aerial platform 100. For example, aerial platform 100 can be an aerial platform of any of the following types: an airplane, a helicopter, a multirotor helicopter (e.g., a quadcopter), an unmanned aerial vehicle (UAV), a powered parachute (also referred to as a paraglider, PPC, and paraplane), and so on. The type of aerial platform 100 can be determined based on various considerations such as aerodynamic parameters (e.g., speed, flight altitude, maneuverability, stability, carrying capacity, etc.), the degree of manual control or automation, additional uses required from the aerial platform, and so on.
[0084] In addition to the image forming sensor 210, the system 10 additionally includes the processor 220 and communication module 230, all of which are connected to the aerial platform 100. The connection of any of the image forming sensor 210, processor 220 and communication module 230 (or any other component of system 10 carried by aerial platform 100) with aerial platform 100 may be a detachable connection, but this is not necessarily so. For example, any of the aforementioned components 210, 220 and/or 230 can be designed to be easily installed on and removed from an aerial platform 100 which can be used for various uses when the relevant components of the system 10 are not installed thereon.
[0085] Fig. 1B is a functional block diagram illustrating an example of system 10 in an exemplary environment, in accordance with examples of the subject matter presently disclosed. As can be seen in the example in fig. 1B, some of the components of system 10 (and especially image forming sensor 210) may be included in an independent detachable capsule 280 that can be attached to and detached from one or more aircraft, based on need. Such a free-standing capsule 280 may constitute the agricultural monitoring system 200, which is discussed below, e.g., with respect to figs. 9-11C.
[0086] Fig. 1C is a functional block diagram illustrating an example of system 10 in an exemplary environment, in accordance with examples of the subject matter herein disclosed. In the example in fig. 1C, some of the components that enable agricultural use of system 10 are located in an external capsule 280, while other functionalities are enabled by components of aerial platform 100 (in the illustrated example, communication module 230).
[0087] As exemplified in figs. 1B and 1C, detachable capsule 280 may be detachable with respect to aerial platform 100. For example, detachable capsule 280 may be detachably attached to the fuselage of aerial platform 100 (e.g., to the underside, as exemplified in Fig. 1B), or to a wing of aerial platform 100 (as exemplified in Fig. 1C).
[0088] It is noted that system 10 may include additional components, such as an altimeter, an airspeed indicator, longitudinal, lateral and/or yaw sensors, an interface for connecting to avionics and other systems of the aerial platform 100 , etc.
[0089] Fig. 2 is a flow diagram illustrating an example method 500 for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Referring to the examples set forth with respect to the previous drawings, method 500 can be performed by system 10. Discussion and further details pertinent to system 10 are provided below, following the discussion pertinent to method 500.
[0090] Stage 510 of method 500 includes flying an aerial image forming sensor along a flight path over an agricultural area in which crops grow. Referring to the examples set forth with respect to the previous drawings, the aerial image forming sensor can be the image forming sensor 210, and the flight of stage 510 can be performed by the aerial platform 100.
[0091] It is noted that plantations of different types may grow in the aforementioned agricultural area, and that the plantations may include one or more types of plants. For example, the agricultural area can be arable land (land under annual crops such as cereals, cotton, potatoes, vegetables, etc.), or land used for permanent crops (e.g., orchards, vineyards, fruit plantations, etc.). It is noted that the agricultural area can also be a marine (or otherwise water-based) agricultural area, e.g., a water surface used for the cultivation of algae species (algaculture). Additionally, although method 500 can be used for agricultural monitoring of cultivated land, it is noted that it can also be used for agricultural monitoring of uncultivated land (e.g., natural forests, pastures, meadows, etc.). In such cases, plants growing in such areas can be monitored like plantations in those areas. The agricultural area being agriculturally monitored in method 500 can include one or more types of agricultural areas (e.g., any one or more of the above examples, e.g., including both an orchard and a potato field).
[0092] Stage 520 of method 500 includes acquiring, by the aerial image forming sensor, image data of parts of the agricultural area, the acquisition of the image data including acquiring, by the aerial image forming sensor, at least part of the image data at a set of imaging sites along the flight path that allow the acquisition of image data at submillimeter image resolution. Referring to the examples set forth with respect to the previous drawings, the acquisition of stage 520 can be performed by the image forming sensor 210.
[0093] The image data acquired at stage 520 may include one or more independent images, one or more video sequences, a combination thereof, and may also include any other type of image data known in the art. The acquisition of image data at stage 520 may include acquiring visible light or other electromagnetic radiation (e.g., ultraviolet (UV) light, infrared (IR) light, or other parts of the electromagnetic spectrum). Other image acquisition technologies can also be used, in addition to or instead of light acquisition. For example, stage 520 may include acquiring image data by a synthetic aperture radar (SAR) sensor.
[0094] The acquisition of the image data at stage 520 includes acquiring at least part of the image data at submillimeter resolution. That is, in at least part of the image data acquired by the aerial imager sensor, parts of the agricultural area are imaged at a level of detail that allows resolving details of those parts of the agricultural area that are finer (i.e., smaller) than one square millimeter (mm²). It is noted that the resolvable details of the image data can be significantly smaller than one square millimeter, e.g., smaller than 0.01 square millimeter.
[0095] It is noted that stage 520 may include acquiring, by the aerial imager sensor, image data of parts of the agricultural area at an image resolution that is finer by at least an order of magnitude than the average crop leaf size in the image. That is, in at least part of the image data, a plurality of leaves of the plantation are imaged at a resolution that allows resolving at least ten independently resolvable parts of each leaf. A different intensity can be measured for each of these resolvable parts of the leaf. Optionally, stage 520 may include acquiring, by the aerial imager sensor, image data of the agricultural area at an image resolution that is finer by at least two orders of magnitude than the average plantation leaf size in the image. Optionally, stage 520 may include acquiring, by the aerial imager sensor, image data of parts of the agricultural area at an image resolution that is finer by at least three or more orders of magnitude than the average plantation leaf size in the image.
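As an illustration of what an image resolution one or two orders of magnitude finer than the leaf size implies, assume (for the example only) a leaf roughly 50 mm across imaged at a 0.5 mm ground sampling distance:

$$ \frac{50\ \mathrm{mm}}{0.5\ \mathrm{mm}} = 100 \ \text{resolvable elements per linear dimension, i.e., on the order of } 10^{4} \ \text{per leaf}. $$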
[0096] Image data in which a single plantation leaf is imaged with a plurality of individually resolvable areas (e.g., more than 100 individually resolvable areas) allows the image data to be used to detect the leaf condition of the plantation, e.g., identifying different leaf diseases, identifying insects and parasites on leaves, identifying indications of effects of parasites on leaves (e.g., eaten parts), and so on.
[0097] It is noted that stage 520 may include acquiring image data of parts of the agricultural area at more than one resolution and/or with more than one image acquisition technology. In such cases, different images (or videos) of the same part of the agricultural area that are taken at different resolutions and/or with different technologies may be taken concurrently or at different times (e.g., at different parts of the flight path, possibly flying in another direction, at another altitude, etc.). Images at different resolutions and/or in different parts of the electromagnetic spectrum can be acquired by a single sensor (e.g., taken at different times, using different lenses, using different optical filters, using different electronic filters, and so on).
[0098] Stage 540 of method 500 includes transmitting to an external system image data content that is based on the image data acquired by the aerial image forming sensor. Referring to the examples set forth with respect to the previous drawings, the transmission of stage 540 may be performed by communication module 230. The image data content that is transmitted at stage 540 may include some or all of the image data acquired at stage 520. Alternatively (or in addition), the image data content that is transmitted at stage 540 may include image data content that is created by processing the image data acquired at stage 520.
[0099] The transmission of stage 540 may include transmitting the image data content in a wireless manner while the aerial platform carrying the aerial image forming sensor is still in the air. However, this is not necessarily so, and part (or all) of the image data content transmitted at stage 540 may be transmitted after the aircraft has landed. Transmission of the image data content may include transmitting the image data content in a wireless manner (e.g., using radio communication, satellite-based communication, a cellular network, etc.), in a wired manner (especially if transmitting the data after the aircraft has landed, e.g., using universal serial bus (USB) communication), or any combination thereof. The transmission of the image data content at stage 540 can be performed in real time or near real time (transferring image data corresponding to a part of the imaged agricultural area before acquiring image data corresponding to another part of the imaged agricultural area), but this is not necessarily so.
[00100] As will be discussed below in more detail, the image data content can be transmitted to different types of entities, and for different uses by such entities. For example, the image data content can be transmitted to an off-site system to be examined by an expert and/or processed by a computerized system to determine significant agronomic data for the agricultural area and/or for the crops within it. In another example, the image data content can be transferred to an aerial application system (e.g., an agricultural aircraft or a ground control system) to determine aerial application parameters for aerial application of pesticides (crop spraying) and/or fertilizer (aerial application of fertilizer to the topsoil). It is noted that aerial application can refer to the application of various kinds of materials from an aircraft - fertilizers, pesticides, seeds, etc. Such aircraft are typically airplanes or helicopters, but other types of aircraft can also be used (e.g., hot air balloons). It is noted that in the context of the present disclosure, an agricultural aircraft (and especially an aerial application aircraft) can be a manned aircraft, but also an unmanned aircraft.
[00101] Fig. 3 is a flow diagram illustrating an example method 600 for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Referring to the examples set forth with respect to the previous drawings, method 600 can be performed by system 10. Method 600 is an example of method 500, and the stages of method 600 are numbered with reference numerals corresponding to those of method 500 (that is, stage 610 is an example of stage 510, stage 620 is an example of stage 520, and so on). It is noted that variations and examples discussed with reference to method 500 (either above or below in the disclosure) are also relevant to method 600, where applicable.
[00102] Method 500, as implemented in example method 600, includes using an aerial imager sensor carried by an aircraft flying at very low altitudes to acquire extremely high resolution images of agricultural crops at a high rate (sampling large areas of the agricultural area in a relatively short time). The image data content generated in the aerial system is transmitted for processing to an off-site remote analytics server. The image data content is then processed by the analytics server, and then distributed to a management interface (e.g., a personal computer, a handheld computer, and so on), where it is provided to an agronomist, to a manager, to another professional, or to a dedicated system for further analysis. The high resolution of the images acquired at stage 620 allows for analysis at the individual leaf level, which can be used, for example, to detect leaf diseases and/or indications of parasite effects on leaves, etc.
[00103] As discussed in greater detail below, not the entire agricultural area is necessarily imaged, and a representative sample of it can be selected. It is noted that agronomists who inspect an agricultural area (e.g., a field, an orchard) for leaf diseases generally sample the agricultural area on foot, producing samples along a sampling trajectory designed to represent parts of the agricultural area. Using an aerial imaging sensor that provides submillimeter resolution images of leaves across the agricultural area at high rates is not only faster than walking the agricultural area, but also allows imaging parts of the agricultural area that are inaccessible to pedestrians. For example, leaves at tree tops can be imaged, as well as plants that are located within dense vegetation or over rough terrain.
[00104] Stage 610 of method 600 includes flying an aerial imaging sensor over an agricultural area in which crops grow, along a flight path that includes a plurality of low-altitude imaging sites that allow for the acquisition of the image data at submillimeter image resolution. The flight path may include continuous low-altitude flight legs (a flight leg being a segment of a flight plan between two points on the path). Referring to the examples set forth with respect to the previous drawings, the aerial image forming sensor can be the image forming sensor 210, and the flight of stage 610 can be performed by the aerial platform 100.
[00105] Optionally, stage 610 may include flying the aerial imager sensor along a terrain-following aerial trajectory (also referred to as terrain-following flight). The altitude of such a terrain-following flight path above the terrain (measured either above the face of the earth or above the vegetation, according to circumstances) may differ, based on different considerations (such as aerodynamic concerns, optical requirements of the image forming sensor, dimensions of the plantations, etc.). For example, stage 610 may include flying the aerial imager sensor above the agricultural area at altitudes lower than 30 meters (30 m) above the ground. For example, stage 610 may include flying the aerial imager sensor above the agricultural area at altitudes lower than 20 m above the ground. For example, stage 610 may include flying the aerial imager sensor above the agricultural area at altitudes lower than 10 m above the ground. It is noted that the height of the terrain-following flight path can also be measured above the top of plantations growing in the agricultural area (e.g., less than 10 m, 20 m, or 30 m above the top of such plantations).
[00106] Fig. 4A illustrates system 10, agricultural area 900, and flight path 910, in accordance with examples of the subject matter presently disclosed. In the illustrated example, agricultural area 900 includes two separate areas - wheat field 901 and orchard 902.
[00107] Flight path 910 includes two main types of flight legs - imaging flight legs 911, along which the aerial image forming sensor acquires image data of the agricultural area, and transitional flight legs 912, in which the aerial platform flies from the end of one imaging flight leg 911 to the beginning of another imaging flight leg 911. The imaging flight legs 911 are illustrated with solid arrows, while the transitional flight legs 912 are illustrated using dashed arrows. Transitional flight legs 912 can be planned over areas that are not of interest for agronomic needs, but possibly also above the agricultural area of interest, e.g., if enough data has already been sampled for that area.
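The two kinds of flight legs of flight path 910 can be pictured with a small data-structure sketch; the types and field names below are illustrative only and do not appear in this disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FlightLeg:
    start: Tuple[float, float]  # (latitude, longitude) at which the leg starts
    end: Tuple[float, float]    # (latitude, longitude) at which the leg ends
    imaging: bool               # True for imaging legs 911, False for transitional legs 912

def imaging_legs(path: List[FlightLeg]) -> List[FlightLeg]:
    """Return only the legs along which image data is acquired."""
    return [leg for leg in path if leg.imaging]
```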
[00108] It is noted that the two parts of agricultural area 900 (i.e., areas 901 and 902) may belong to different entities. For example, wheat field 901 might be owned by farmer MacGregor, while orchard 902 might be an agricultural company's research orchard. Thus, in a single flight, method 500 (and therefore also method 600) may include collecting image data from farms of independent entities.
[00109] Clearly, field 901 and orchard 902 differ from each other in both agricultural and agronomic aspects. Imaging these two different areas may therefore require different operational parameters - of the aerial platform (e.g., speed, altitude above ground, stability, etc.) and/or of the aerial imaging sensor (e.g., exposure time, f-number, lens focal length, resolution, detector sensitivity, speed compensation, etc.). It is noted that the acquisition of image data at stage 520 (and therefore also at stage 620) may include acquiring image data from different parts of the agricultural area using different acquisition modes (differing from each other in aerodynamic and/or sensor parameters, e.g., as discussed above).
[00110] Returning to fig. 3, stage 620 of method 600 includes acquiring, by the aerial imager sensor, image data of parts of the agricultural area at submillimeter resolutions. It is noted that parts of the agricultural area may also be imaged at lower resolutions (e.g., to generate guidance images with which the submillimeter image data can be associated). However, most of the section of the agricultural area that is imaged at stage 620 is preferably imaged at submillimeter resolution. As mentioned above with respect to method 500, this imaged section of the agricultural area can optionally be a sample of the agricultural area for which agronomic analysis is obtained in method 600. The same parts that are imaged at submillimeter resolution can also be imaged at lower resolution, as discussed above. Referring to the examples set forth with respect to the previous drawings, the acquisition of stage 620 can be performed by the image forming sensor 210.
[00111] The imaging of the agricultural area at stage 620 includes acquiring imaging data of representative parts of the agricultural area (e.g., sampled at different sampling locations across the agricultural area) at an image resolution that is sufficient to analyze individual leaves of the imaged plantations (e.g., finer by at least one or two orders of magnitude than the average leaf size of the imaged plantations). Fig. 13 illustrates several images 1000 acquired by an aerial imager sensor, in accordance with method 600, in accordance with examples of the subject matter presently disclosed. As can be seen from the different illustrations, leaves from different plant species can be analyzed for different types of leaf conditions (e.g., dryness, pests, diseases, etc.).
[00112] Returning to fig. 3, it is noted that the image resolution of the image data acquired by the aerial image forming sensor depends on several factors - some of which depend on the image forming sensor itself (e.g., lens, pixel density of the detector, etc.), and some of which depend on the aerial platform (e.g., altitude above ground, speed, stability, etc.).
[00113] A ground sampling distance (GSD) can be defined for the acquired image data as the distance between pixel centers measured on the ground. For example, in image data (corresponding to a single image or to video data) with a 500 micrometer GSD, adjacent pixels image locations that are 500 micrometers apart on the ground. It is noted that the GSD of the image is not equal to its resolution, as resolving data of adjacent pixels imposes additional requirements (e.g., the optical resolution quality of the lens used for imaging). GSD is also referred to as Projected Ground Sample Interval (GSI) or Projected Ground Instantaneous Field of View (GIFOV).
[00114] As a general consideration, given a specific image forming sensor, the GSD is approximately proportional to the distance between the image forming sensor and the imaged subject. The low-altitude flight of stage 510 can therefore facilitate the acquisition of image data at submillimeter resolution. Optionally, the GSD of the image data acquired at stage 620 is less than 0.75 mm (i.e., each pixel covers a ground area of less than 0.75 × 0.75 mm²). Optionally, the GSD of the image data acquired at stage 620 is less than 0.5 mm (i.e., each pixel covers a ground area of less than 0.5 × 0.5 mm²).
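Under a simple pinhole-camera model (a standard approximation; the numeric values below are assumed for illustration and are not taken from this disclosure), the GSD relates the detector pixel pitch p, the lens focal length f, and the distance H between the image forming sensor and the imaged crops:

$$ \mathrm{GSD} = \frac{p \cdot H}{f}, \qquad \text{e.g.}\quad \frac{5\ \mu\mathrm{m} \times 10\ \mathrm{m}}{100\ \mathrm{mm}} = 0.5\ \mathrm{mm}. $$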
[00115] Stage 630 of method 600 includes processing the image data by an aerial processing unit to provide image data content that includes high quality images of crop leaves. The aerial processing unit is carried by the same aerial platform that flies the aerial image forming sensor over the agricultural area. Referring to the examples set forth in relation to the previous drawings, stage 630 can be performed by the processor 220.
[00116] The processing of stage 630 may include filtering the image data (e.g., to discard image data that is not of sufficient quality, or to select a representative image for each area), compressing the image data, enhancing the image data (e.g., applying image enhancement processing algorithms thereto), selecting agronomically significant data, or any combination of the above, as well as other possible processing techniques that are known in the art.
[00117] For example, the processing of stage 630 may include processing the acquired image data to filter out image data that is not of sufficient quality, analyzing the remaining images to identify leaves of the agricultural area plantations (e.g., based on leaf identification parameters preloaded into the processing module) in some of the acquired images, selecting the images that include identifiable leaves as a high-quality representative sample, and compressing the selected images to provide the image data content to be transmitted to an external system.
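A condensed sketch of the on-board processing chain described in this example is given below. The helper callables (quality scoring, leaf detection, compression) are placeholders for whatever concrete algorithms the processing module is loaded with; their names and the thresholds are assumptions made for the sketch.

```python
def build_image_data_content(frames, quality_score, detect_leaves, compress,
                             min_quality=0.6, sample_size=50):
    """Filter, select and compress acquired frames into transmittable content.

    quality_score(frame) -> float in [0, 1]
    detect_leaves(frame) -> list of detected leaf regions (possibly empty)
    compress(frame)      -> compressed bytes
    """
    # 1. Discard image data that is not of sufficient quality.
    usable = [f for f in frames if quality_score(f) >= min_quality]
    # 2. Keep only frames in which individual leaves are identifiable.
    with_leaves = [f for f in usable if detect_leaves(f)]
    # 3. Select a high-quality representative sample, best frames first.
    selected = sorted(with_leaves, key=quality_score, reverse=True)[:sample_size]
    # 4. Compress the selected frames for transmission to the external system.
    return [compress(f) for f in selected]
```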
[00118] Stage 640 of method 600 includes wirelessly transmitting the image data content to an off-site remote server for distribution to end users. Referring to the examples set forth with respect to the previous drawings, the transmission of stage 640 can be performed by communication module 230. The wireless transmission of the image data content at stage 640 can be performed in different ways (e.g., using radio communication, satellite-based communication, a cellular network, etc.).
[00119] From the server, the image data content - or significant agronomic data that is based on the image data content - can be distributed to various entities such as farmers, agronomists, aircraft pilots, aerial systems, etc.
[00120] Fig. 4B illustrates system 10, agricultural area 900, flight path 910, server 300, and a plurality of exemplary entities to which significant agronomic data that is based on image data acquired by system 10 can be transmitted, in accordance with examples of the subject matter presently disclosed.
[00121] Optionally, various computerized processing algorithms can be applied by the server to the image data to identify selected agronomically significant data, and to generate agronomic image data for transmission to a remote system based on the selected agronomically significant data.
[00122] For example, the image data content (whether processed or not) can be provided to an agronomist 992 (in the illustrated example this is done via a satellite connection 994). The agronomist 992 (e.g., an agronomist specializing in quinoa who resides in another country) can review the data provided and, in turn, recommend what next step should be taken. Such information can be provided to a farmer or agricultural land owner 993, or directly to another entity (e.g., an aerial application instruction to spray crops with crop protection products, provided directly to an agricultural aircraft 991 that can apply such products to the agricultural area).
[00123] It is noted that the aerial platform 100 of system 10 can be an agricultural aircraft used for aerial spraying. The acquisition of the image data by the aerial imager sensor in such a case can be performed while in an aerial application flight (either concurrently with the aerial application, or at other times in the flight). In this way, a dedicated aerial imager sensor can be installed on an agricultural aircraft that is intended to fly over the agricultural area, and the flight can thus be used for the added benefit of collecting image data of agronomic interest.
[00124] Such directions or recommendations do not necessarily require the involvement of an agronomist, and optionally other entities (e.g., farmer 993 or server 300 itself) can analyze information that is based on the image data acquired by system 10, to provide recommendations, instructions, analyses, or other information that can be used to improve a condition of the agricultural area and/or of the crops growing in it.
[00125] Additionally, the information collected with respect to the agricultural area imaged by system 10 can be used to determine how to improve a condition of an area other than the imaged agricultural area. For example, if the acquired image data allowed aphids to be identified in the agricultural area, nearby fields can also be sprayed based on this information.
[00126] Figs. 5A to 5E illustrate optional stages of method 500 for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Figs. 5A to 5E illustrate additional stages and variations on stages presented above that can be implemented as part of method 500. It is noted that not all of these stages and variations are necessarily implemented together in a single implementation of the invention. All combinations of stages or variations that are discussed in connection with method 500 can be implemented, and form part of this disclosure.
[00127] Referring to stage 510, optionally the flight path is a terrain-following flight path. In other words, stage 510 may include optional stage 511 of flying the aerial image forming sensor along a terrain-following flight path. The altitude of the terrain-following flight path above the ground can be lower than a predetermined height during imaging flight legs, e.g., lower than 20 m above the ground (or above the height of the crops where applicable, e.g., above a dense forest).
[00128] It is noted that stage 510 may include flying the aerial platform at altitudes that reduce the effects of optical aberrations of the image forming sensor, and of vibrations of the image forming sensor and/or the aerial transport platform, on the acquired image data, to allow acquisition of the image data at submillimeter resolution.
[00129] As discussed in more detail below with respect to stage 520, optionally image data is acquired by the aerial image forming sensor while the aerial platform is in motion, possibly without requiring the aerial platform to slow down. In this way, system 10 as a whole can image larger parts of the agricultural area in a given time. Thus, stage 510 may include stage 512 of flying at speeds exceeding 10 m/s through each imaging site of the aforementioned set of imaging sites (in which the acquisition of the image data at submillimeter image resolution is performed).
[00130] Given an average speed of the aerial platform along an imaging flight leg that includes a plurality of the aforementioned imaging locations, the flight of stage 510 may include stage 513 of flying the aerial image forming sensor along the imaging sites of that imaging flight leg at speeds that do not fall below 50% of the average speed along that imaging flight leg.
[00131] Stage 510 may include stage 514 of flying the aerial image forming sensor by an agricultural aircraft that is configured for aerial application of crop protection products. It is noted that the acquisition of stage 520 in this case can be performed in parallel with the aerial application (usually performed at very low altitudes above the crops, e.g., at altitudes of 3-5 meters above the crops, and possibly even lower), or in other parts of the flight (e.g., when the agricultural aircraft is in transition between two fields). As discussed in greater detail below, the application itself can be based on processing of image data acquired in method 500, either real-time processing of image data acquired by the same aerial system, or processing of image data acquired on previous flights.
[00132] Referring to the examples recorded with respect to the previous drawings, each of stages 511, 512, 513 and 514 can be performed by aerial platform 100.
[00133] As mentioned above, the agricultural area can include different areas that are associated with different entities. It is therefore noted that stage 510 may include flying the aerial image forming sensor along a flight path extending over at least a first farm of a first owner and a second farm of a second owner other than the first owner. In such a case, the acquisition at stage 520 may include acquiring first image data from parts of the first farm and acquiring second image data from parts of the second farm, and the method may further include generating first image data content based on the first image data and generating second image data content based on the second image data. This makes it possible to provide the first image data content to a first entity in a first message, and the second image data content to a second entity in a second message. Each of the first message and the second message can include information identifying the owner of the respective farm, and/or can be routed to a system and/or another entity associated with the respective owner. It is noted that the distinction between the first image data content and the second image data content is not necessarily performed by the onboard system 200, and may also be performed by the server 300.
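By way of illustration only, the following Python sketch shows one way geotagged frames could be assigned to farms so that separate messages can be generated per owner; the rectangular farm boundaries and data structures are assumptions for the sketch (real plots would typically be described by GIS polygons).

from collections import defaultdict

def route_frames(frames, farms):
    """frames: list of (frame_id, lat, lon); farms: dict owner -> (lat_min, lat_max, lon_min, lon_max)."""
    per_owner = defaultdict(list)
    for frame_id, lat, lon in frames:
        for owner, (la0, la1, lo0, lo1) in farms.items():
            if la0 <= lat <= la1 and lo0 <= lon <= lo1:
                per_owner[owner].append(frame_id)
                break
    # Each owner's list becomes the payload of a separate message.
    return dict(per_owner)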
[00134] Referring now to stage 520, it includes acquiring, by the aerial image forming sensor, image data of parts of the agricultural area, where the acquisition of the image data includes acquiring by the aerial image forming sensor at least part of the image data at a set of imaging sites along the flight path that allow the acquisition of image data at submillimeter image resolution.
[00135] As mentioned above, image data can be acquired while the aerial platform is progressing along the flight path at a regular pace, without reducing its flight speed. Optionally, stage 520 may include stage 521 of acquiring the image data (some or all of it) at the set of imaging sites while flying the aerial image forming sensor over the imaging sites at speeds that do not fall below 50% of the average speed of the aerial platform along the flight path.
[00136] It is noted that reducing speed may not be required at all, and stage 520 acquisition can be performed without reducing a flight speed at which the aerial imaging sensor is flying along the flight path. Optionally, the acquisition of stage 520 may include compensating for motion of the image forming sensor during image data acquisition. This can be achieved, for example, using one or more motion compensation techniques.
[00137] Such various techniques for motion compensation can be used, for example, to avoid image blur that results from acquiring images while the aerial platform carrying the aerial imager sensor is flying forward.
[00138] One such technique that can be used as part of method 500 for motion compensation is moving the aerial image forming sensor (or part thereof) during the image data acquisition process. The movement of the aerial image forming sensor (or one or more relevant parts thereof) can be performed when image data is actually being collected (e.g., when a detector of the aerial image forming sensor, such as a charge-coupled device, CCD, is collecting light arriving from the agricultural area), but it can also be performed in other parts of the image data acquisition process (e.g., during a focusing process that precedes the light collection).
[00139] This type of motion compensation can be achieved by moving one or more parts of the aerial image forming sensor without rotating the optical axis of the light collecting parts of the sensor (e.g., moving the sensor in a direction opposite to the direction of flight), and/or by moving or rotating parts of the aerial image forming sensor so as to rotate its light-collecting optical axis (e.g., by rotating a mirror or prism that directs light arriving from an imaged location of the agricultural area onto a light recording portion of the sensor, such as a CCD).
[00140] Stage 520 may therefore include stage 522 of mechanically moving at least one component of the aerial image forming sensor with respect to an aerial transport platform, to compensate for movement of the aerial image forming sensor with respect to the crops during the acquisition.
[00141] Motion compensation in stage 520 can reduce the relative speed between the imaged location and the light recording part to substantially zero, or simply reduce it enough that the effects of the relative motion between the two on image quality are less than a preset threshold.
[00142] If, as mentioned earlier, motion compensation by rotating parts of the aerial image forming sensor starts during the focusing stage, it is noted that focusing can start while the optical axis of light acquisition is diagonal with respect to the ground, and the actual acquisition of the image data can take place in the part of the rotation movement in which the optical axis toward the imaged crop (e.g., the imaged leaf) is substantially vertical.
[00143] Optionally, the acquisition of stage 520 may include: mechanically rotating at least one optical component of the aerial image forming sensor (e.g., rotating mirror 213, mirror prism 212, etc.) with respect to an aerial transport platform, to compensate for the movement of the aerial image forming sensor with respect to the crops during acquisition; and, concurrently with the rotation of the at least one optical component, for each frame of a plurality of frames of image data: initiating a focusing process of the image forming sensor when an acquisition optical axis is at an angle greater than 20° from the vertical axis, and acquiring the image data using vertical imaging when the acquisition optical axis is at an angle of less than 20° from the vertical axis. The acquisition optical axis is the line connecting a center of an imaged location of the agricultural area in a given frame (the area covered by the specific image frame) and a center of an aperture (e.g., transparent window 219) through which light enters the imaging system toward the rotating optical component.
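By way of illustration only, the following Python sketch expresses the timing rule described above (focusing initiated while the acquisition optical axis is more than 20° from vertical, exposure triggered when it is within 20° of vertical); the camera interface and the control loop that calls this routine are assumptions for the sketch.

def handle_rotation_step(axis_angle_deg, camera, focusing_started):
    """axis_angle_deg: angle between the acquisition optical axis and the vertical axis.
    camera: assumed object exposing start_focus() and trigger_exposure()."""
    if axis_angle_deg > 20.0 and not focusing_started:
        camera.start_focus()        # begin focusing while the axis is still oblique
        return True
    if axis_angle_deg <= 20.0:
        camera.trigger_exposure()   # acquire near-vertical imagery
    return focusing_started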
[00144] Generally, whether motion compensation is used or not, acquiring the image data at stage 520 may include acquiring some or all of the image data using vertical imaging (either strictly vertical or slightly oblique imaging, e.g., less than 20 degrees from vertical).
[00145] Additionally or alternatively, other motion compensation techniques may optionally be implemented as part of method 500. For example, stage 520 may include stage 523 of illuminating the crops during acquisition, to compensate for motion of the aerial image forming sensor with respect to the crops during acquisition. The illumination of stage 523 can include flash lighting, steady lighting (at least for the duration of the acquisition, but possibly significantly longer), or other types of lighting. Optionally, the illumination can start when a focusing process that precedes the image acquisition starts.
[00146] As mentioned above, acquiring image data (and especially submillimeter resolution image data) from the aerial platform as disclosed with respect to method 500 makes it possible to collect image data of agricultural and agronomic significance in places that are otherwise unreachable or inaccessible, or where access would be slow, dangerous, expensive and/or harmful to the crops. For example, leaves at tree tops can be imaged, as well as plants that are located within dense vegetation or over rough terrain. Optionally, stage 520 may include stage 524 of acquiring image data of parts of the agricultural area that are inaccessible to land vehicles. While it may be possible to design and manufacture a land vehicle that reaches the treetops of rainforest trees, it is complicated and expensive to do so, and possibly harmful to the natural environment. The inaccessibility of stage 524 is especially pertinent to land vehicles that are commonly used in agriculture, such as tractors, pickup trucks, center pivot irrigation equipment, combine harvesters, cotton pickers, etc. It is noted that stage 520 may include acquiring image data of parts of the agricultural area that are inaccessible on foot (i.e., to a person walking, climbing, etc.).
[00147] As also mentioned above, the image data acquired by the image forming sensor at stage 520 does not necessarily represent all of the agricultural area, and may instead image a representative sample of it.
[00148] The relative part of the agricultural area that is imaged by the image forming sensor may differ between different types of crops. A different minimum coverage area definition can be set for each type of crop. A benchmark that can be used for such a definition of the coverage area of a full field is the coverage that can be achieved by a human ground inspector walking on foot, or a higher percentage. For example, if an inspector on foot is expected to examine 2-3% of the field, non-randomly and focused on the outer parts of the field that the inspector can reach on foot and/or by car, the flight path may be planned in such a way that it will generate random coverage, including the interior of the field (and not just the outer parts), of at least 3-5%.
[00149] Optionally, stage 520 can include stage 525 of acquiring image data of the agricultural area at a coverage rate below 500 square meters per hectare (i.e., less than 5% of the agricultural area is covered by the image data).
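By way of illustration only, the following back-of-the-envelope Python check shows how such a coverage rate relates to the 500 square meters per hectare bound; the frame footprint and sampling density are assumed values, not figures from the disclosure.

frame_footprint_m2 = 3.0 * 4.5      # assumed single-frame ground footprint, m^2
frames_per_hectare = 30             # assumed sampling density
covered_m2_per_ha = frame_footprint_m2 * frames_per_hectare   # 405 m^2 per hectare
coverage_fraction = covered_m2_per_ha / 10_000                # about 4% of the area
print(covered_m2_per_ha, coverage_fraction)                   # stays below the 500 m^2/ha bound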
[00150] It is noted that stage 520 may include stage 526 of focusing the image forming sensor, prior to collecting light by the image forming sensor. It is noted that focusing the image forming sensor can be difficult, especially if performed while the aerial platform is flying at significant speeds (e.g., above 10 m/s, at speeds that do not fall below 50% of the average speed of the aerial platform along the flight path, etc., e.g., as discussed in relation to stage 521). Focusing can be affected not only by movement that results from the motion of the aerial platform relative to the imaged location, but also by movement within the imaging system (e.g., as discussed in relation to stage 522). It is noted that the operational parameters of the imaging system (e.g., system 200) and/or of the aerial transport platform can be selected to allow focusing. For example, the maximum altitude above the tops of the crops can be selected to allow efficient focusing of the image forming sensor during flight.
[00151] Stage 530 of method 500 includes processing the image data by an airborne processing unit to provide image data content that includes high quality images of crop leaves. The airborne processing unit is carried by the same aerial platform that flies the aerial image forming sensor over the agricultural area. Referring to the examples recorded with respect to the previous drawings, stage 530 can be executed by processor 220.
[00152] Stage 530 processing may include filtering the image data (e.g., to discard image data that is not of sufficient quality, or to select a representative image for each area), compressing the image data, enhancing the image data (e.g., applying image enhancement algorithms thereto), selecting agronomically significant data, or any combination of the above, as well as other possible processing techniques that are known in the art.
[00153] For example, stage 530 processing may include processing the acquired image data to filter out acquired images that are not of sufficient quality, analyzing the remaining images to identify leaves of the agricultural area crops (e.g., based on leaf identification parameters preloaded into the processing module) in some of the acquired images, selecting the images that include identifiable leaves as a high-quality representative sample, and compressing the selected images to provide the image data content to be transmitted to an external system.
[00154] During the flight along the flight path and the image collection, the aerial system can optionally perform an initial image analysis, e.g., assessing photo quality, blur level and image resolution in order to exclude images that do not meet the minimum requirements of the remote image analysis server, thus saving analysis time and data transfer to remote locations, whether a server or an end-product interface.
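By way of illustration only, one commonly used sharpness proxy for such an onboard quality gate is the variance of the Laplacian, as in the following Python sketch; it assumes OpenCV (and its NumPy dependency) is available onboard, and the threshold is an arbitrary illustrative value rather than a requirement of the disclosed system.

import cv2

def sharp_enough(image_bgr, threshold=150.0):
    """Return True if the frame is sharp enough to be worth transmitting."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    score = cv2.Laplacian(gray, cv2.CV_64F).var()   # higher variance means more high-frequency detail
    return score >= threshold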
[00155] As mentioned above, stage 540 includes transmitting to an external system image data content that is based on the image data acquired by the aerial image forming sensor.
[00156] Stage 540 may include stage 541 of transmitting the image data content to the external system for display, to an agronomist at a remote location, of agronomic image data that is based on the image data content, thereby enabling the agronomist to remotely analyze the agricultural area. It is noted that the image data content can be transmitted to the external system directly or via an intermediary system (e.g., a server), and that the external system can display the agronomic image data directly to the agronomist, or provide to another system information that allows displaying the agronomic image data to the agronomist (e.g., an agronomist's handheld computer, such as a smartphone). It is noted that such agronomic image data (e.g., selected images of infected leaves) can be transmitted to one or more agronomists and/or to other entities, e.g., as discussed with respect to fig. 4B. Additional details regarding optional stage 541 are discussed below.
[00157] A few of the optional stages that can be included in method 500 are illustrated in fig. 5D. It is noted that each of the different routes illustrated between stage 530 and any higher-numbered stage is optional, and that some stages can be reached in different ways in different implementations of method 500. For example, stage 580 can be executed directly following stage 540, or following an intermediate execution of stage 550 (which may include different combinations of the substages of stage 550). It is noted that despite all the illustrated routes (indicating execution order) of stages in fig. 5D, these routes do not exhaust all possible options, and additional routes can also be chosen, depending on various considerations that will naturally present themselves to a person skilled in the art.
[00158] The image data content can be processed and used in various ways. It can be used as a basis for various decisions and actions - such as in what ways the crops should be treated, what additional monitoring of the agricultural area is required, how crops in adjacent (or even remote) agricultural areas should be addressed, when the crops are expected to mature, how much the agricultural area is expected to produce and at what times, and so on.
[00159] Method 500 may include processing the image data content (or information based thereon) to provide decision-facilitating information. Processing may be performed by an airborne processor carried by the aerial system (denoted 551), by a server (denoted 552), and/or by an end-user device (denoted 553). Processing can involve human input (e.g., at the end-user device, where the agronomist can feed in instructions based on his analysis of the image data content, or indicate to the producing farmer which signs to watch for in order to verify that a suggested treatment is working).
[00160] For example, the processing of stage 550 may include detecting individual leaves, and cropping only the leaves from the image data, as exemplified in fig. 14. Fig. 14 illustrates the cropping of individual leaves from the image data, in accordance with examples of the subject matter presently disclosed. Image 1000 can be processed to detect leaf edges (image 1010), and then the other imaged parts can be removed, to provide an image including only individual leaf information (image 1020). Leaf cropping - or other image processing algorithms applied to the image data - may be based on a multivariate database of leaf images, parameters and/or data collected at multiple stations.
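By way of illustration only, the following Python sketch follows the spirit of images 1000, 1010 and 1020: detect edges, close them into contours, and crop each sufficiently large contour as an individual-leaf image. It assumes OpenCV is available; the parameters are illustrative and are not taken from the disclosure.

import cv2

def crop_leaves(image_bgr, min_area_px=2000):
    """Return a list of sub-images, one per sufficiently large leaf-like contour."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                                   # leaf edge map (image 1010)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)          # close edges into contours
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    crops = []
    for c in contours:
        if cv2.contourArea(c) >= min_area_px:
            x, y, w, h = cv2.boundingRect(c)
            crops.append(image_bgr[y:y + h, x:x + w])                  # individual-leaf crop (image 1020)
    return crops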
[00161] Stage 530 processing can provide, for example, any one or more of the following: leaf size statistics, leaf density statistics, leaf color and spectral analysis, and morphological statistics.
[00162] The image data content can be processed and used in various ways. Optionally, it can be transmitted to one or more entities, e.g., as discussed above (for example, with respect to fig. 4B). The image data content can also be used to determine parameters that are pertinent to acquiring additional image data in another instance of method 500, whether aerodynamic or operational parameters for the aerial platform, or operational parameters for another system carried by the aerial platform (e.g., agricultural spraying parameters), and so on.
[00163] Method 500 may include stage 560 of obtaining operational parameters for the aerial system and/or systems installed thereon, based on the image data collected in stage 520. Referring to the examples recorded with respect to the previous drawings, stage 560 can be performed by communication module 230.
[00164] Optionally, method 500 may include planning a next flight based on the image data content obtained in method 500. The planning may be based on the image data content together with agricultural considerations and/or additional considerations. Method 500 may include another instance of stages 510, 520, 530 and 540 following stage 540. In such a case, method 500 may include stage 561 of planning a flight path for the subsequent instance of flying, based on the image data acquired in a previous instance of acquisition.
[00165] Optionally, method 500 may include stage 562 of selecting aerial application parameters for aerial application of crop protection products by the agricultural aircraft, based on processing of the image data.
[00166] All stages discussed above are performed aboard the aerial platform that carries the image forming sensor used for the acquisition of stage 520 (where stage 550 can also be performed - partially or completely - by remote systems). Other stages of the process can also be performed by other entities (not carried by the aerial platform), such as a server or end-user units.
[00167] Optional stage 570 includes transmitting to an end-user device, via a server that is located away from the aerial platform, decision-facilitating information that is based on the image data content. Referring to the examples recorded with respect to the previous drawings, stage 570 can be performed by server 300. The transmission can be performed wirelessly and/or through wired communication means, and can be facilitated by one or more intermediary systems (e.g., internet routers, etc.). Several examples of the information that is based on the image data content and that can facilitate decisions are provided above, as well as examples of the decisions that can then be made.
[00168] Optional stage 580 includes analyzing information that is based on the image data content to provide agronomic and/or agricultural decisions. Referring to the examples recorded with respect to the previous drawings, stage 580 can be performed by an end-user device such as a computer used by a farmer or an agronomist (whether a portable computer or not), by a user interface (UI) connected directly to the server, and so on. It is noted that the analysis of stage 580 can be fully computerized (e.g., using only dedicated hardware, software and/or firmware), or may involve human input to varying degrees (e.g., the agronomist analyzing received images of leaves, based on years of professional experience). The outputs of stage 580 can be transmitted to any of the other entities (e.g., the server, the aerial system, and so on).
[00169] Method 500 may include stage 590 of presenting to an agronomist at a remote location (i.e., remote from the agricultural area, possibly in another country) agronomic image data that is based on the image data content, thereby enabling the agronomist to remotely analyze the agricultural area. Referring to the examples recorded with respect to the previous drawings, stage 590 can be performed by an end-user device such as a computer used by a farmer or an agronomist (whether a portable computer or not), by a user interface (UI) connected directly to the server, and so on.
[00170] Returning to stage 550, which includes processing the image data content or information based thereon to provide decision-facilitating information, it is noted that the processing may include various processing procedures.
[00171] Stage 550 may include stage 554 of applying computerized processing algorithms to the image data content (either directly, or indirectly to information that is based on the image data content) to detect leaf diseases and/or indications of the effect of parasites on leaves of one or more plants in the agricultural area. It is noted that the disease detection of stage 554 can be used as a basis for further analysis, whether computerized or not. For example, computerized processing algorithms can be used to detect leaves that have been eaten by parasites, and these images can then be transferred to an agronomist to assess the type of parasite and what measures should be taken to aid the crops.
[00172] Stage 550 may include stage 555 of determining health parameters at the large-scale level (e.g., for the entire field, for a hectare of forest, for a municipality or country, etc.), based on high resolution images of many individual plants in the agricultural area. Such parameters may include, for example: irrigation failures or the general irrigation level, nitrogen level, or leaf disease above a certain coverage that is significant for the whole field, by crop and time in the growing season, as in possible cases of late aphid infestation; in the case of insects such as the Colorado beetle, a certain number of identifications at scattered locations in the field (as defined by the GPS locations of the photos in which the insect was identified) will define the entire field as infested. An additional parameter is the percentage of emergence at the early growth stage, which is defined by the dispersed low-altitude flight pattern and allows determining emergence at the full-field level.
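By way of illustration only, the following Python sketch shows one possible field-level aggregation rule of the kind described above: if detections are confirmed in enough scattered GPS-located photos, the whole field is flagged as infested. The thresholds and the dispersion test are assumptions for the sketch.

def field_infested(detections, min_hits=5, min_spread_m=100.0):
    """detections: list of (x_m, y_m) local coordinates of photos with a confirmed detection."""
    if len(detections) < min_hits:
        return False
    xs = [p[0] for p in detections]
    ys = [p[1] for p in detections]
    spread = max(max(xs) - min(xs), max(ys) - min(ys))
    return spread >= min_spread_m   # hits are scattered across the field, not a single hotspot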
[00173] Stage 550 may include stage 556 of processing image data acquired at different times over multiple weeks to determine growth parameters for plants in the agricultural area. Image data can be acquired over multiple weeks by repeating the flying, acquisition and transmission stages multiple times over multiple weeks. The repetition of these stages can be performed by a single aerial system, or by different aerial systems.
[00174] Stage 550 may include stage 557 of applying computerized processing algorithms to the image data (directly or indirectly, e.g., to the image data content) to identify selected agronomically significant data, and generating agronomic image data for transmission to a remote system based on the selected agronomically significant data. This can be used, for example, to determine what data to send for examination by the agronomist.
[00175] Stage 550 may also include selecting a recipient for the processed information, e.g., to which agronomist (or other specialist or system) the information is to be communicated. Stage 550 may include stage 558 of applying computerized algorithms to the selected agronomically significant data to select, from a plurality of possible recipients, a recipient for the agronomic image data, based on the agronomic experience of the potential recipients.
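By way of illustration only, the following Python sketch shows one possible scoring scheme for selecting a recipient by agronomic experience, in the spirit of stage 558; the recipient records and the weighting are assumptions for the sketch.

def pick_recipient(recipients, crop, suspected_issue):
    """recipients: list of dicts like {'name': ..., 'crops': set, 'issues': set, 'years': int}."""
    def score(r):
        # Favor crop-specific and issue-specific expertise, then years of experience (capped).
        return (crop in r['crops']) * 2 + (suspected_issue in r['issues']) * 2 + min(r['years'], 20) / 20
    return max(recipients, key=score, default=None)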
[00176] Fig. 6 is a flow diagram illustrating an example of method 700 for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Referring to the examples recorded with respect to the previous drawings, method 700 can be performed by system 10. Method 700 is an example of method 500, and the stages of method 700 are numbered with reference numerals corresponding to those of method 500 (that is, stage 710 is an example of stage 510, stage 720 is an example of stage 520, and so on). It is noted that variations and examples discussed with reference to method 500 (either above or below in the disclosure) are also relevant to method 700 where applicable.
[00177] Method 500, as implemented in example method 700, includes using an aerial image forming sensor carried by an aircraft flying at very low altitudes to acquire extremely high resolution images of agricultural crops at a high rate (sampling large parts of the agricultural area in a relatively short time). The image data content generated by the aerial system is transmitted for processing on an off-site remote analytics server. The image data content is then processed by the analytics server, after which it is distributed to a management interface (e.g., a personal computer, a handheld computer, and so on) where it is provided to an agronomist, a manager, another professional or a dedicated system for further analysis. The high resolution of the images acquired at stage 720 allows individual leaf analysis, which can be used, for example, to detect leaf diseases or indications of the effect of parasites on leaves.
[00178] Stage 710 of method 700 includes flying, by an agricultural aircraft (e.g., a spraying aircraft as illustrated in Fig. 6), an airborne digital camera over a potato field in which potatoes grow, at speeds between 10 and 15 m/s, along a flight path that includes a plurality of low-altitude imaging sites about 40 feet above the level of the crop, which allows acquisition of the image data at submillimeter image resolution.
[00179] Stage 720 of method 700 includes acquiring, by the airborne digital camera, image data of parts of the potato field at submillimeter resolutions of about 0.4 mm. The ground area covered by the digital camera in a single image is illustrated by the trapezoid drawn over the field.
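By way of illustration only, the following Python check reproduces the order of magnitude of the 0.4 mm figure from the usual ground sample distance relation (GSD = altitude x pixel pitch / focal length); the camera parameters are assumed values and are not specified in the disclosure.

altitude_m = 12.0          # roughly 40 feet above the crop
pixel_pitch_m = 4.3e-6     # assumed detector pixel pitch
focal_length_m = 0.135     # assumed 135 mm lens
gsd_m = altitude_m * pixel_pitch_m / focal_length_m
print(round(gsd_m * 1000, 2), "mm per pixel")   # about 0.38 mm, consistent with the ~0.4 mm figure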
[00180] Stage 730 of method 700 includes processing the image data by an airborne processing unit aboard the agricultural aircraft, to provide image data content that includes high quality images of potato leaves. In the illustrated example, part of the image acquired at stage 720 is cropped, such that only the area around suspicious points detected in the acquired image is prepared for transmission at stage 740. In the illustrated example, the suspicious points are areas of leaves that show early stages of aphid infestation.
[00181] Stage 740 of method 700 includes wirelessly transmitting to a remote off-site server the content of the image data for distribution to end users such as an agronomist.
[00182] Fig. 7 is a flow diagram illustrating an example of method 800 for agricultural monitoring in accordance with examples of the subject matter presently disclosed. Referring to the examples recorded with respect to the previous drawing, method 800 can be performed by system 10. Discussion and further details pertinent to system 10 are provided below, following discussion pertinent to method 800.
[00183] Method 800 includes a stage of defining a surveillance flight plan (stage 805, which is discussed below), which is followed by acquisition and use of image data of an agricultural area based on the surveillance flight plan. The stages of method 800 that follow stage 805 may be variations of corresponding stages of method 500 (corresponding stages of these two methods are numbered with corresponding reference numerals, that is, stage 810 corresponds to stage 510, stage 820 corresponds to stage 520, and so on). It is noted that variations and examples discussed with reference to method 500 are also relevant to method 800, where applicable, mutatis mutandis. Where applicable, the relevant variations of stages 510, 520, and possibly also of 530, 540 and the following stages, can be implemented in the corresponding stages of method 800 (i.e., 810, 820, and so on) as executed also based on the surveillance flight plan defined in stage 805.
[00184] Stage 805 of method 800 includes defining a surveillance flight plan for an aerial surveillance system, the surveillance flight plan including an acquisition site plan indicative of a plurality of imaging sites.
[00185] Referring to the examples recorded with respect to the previous drawings, stage 805 can be performed by different entities, such as aerial system 10, server 300, an end-user device (e.g., of agronomist 992, of farmer 993, of an unillustrated planning center, and so on), or any combination thereof (e.g., a plan may be suggested by agronomist 992, and then revised by aerial system 10 based on weather conditions).
[00186] The definition of stage 805 can be based on several considerations. For example, the surveillance flight path and possibly additional parameters can be defined in order to allow image acquisition with the required qualities. Stage 805 may include, for example, the following substages: • Based on information obtained from the client, define the desired agricultural areas (also referred to as “plots”); • Receive geographic information system (GIS) information for the plots, as well as information regarding the plot structure (such as GIS information regarding irrigation pipes, roads, or other aspects of the plot structure); • Optionally receive information regarding the crops growing in the agricultural area, such as the type of crops, age of the crop (since planting), variety, etc.; • Based on the GIS information (possibly using additional information), define the plot topography and the obstacles on each plot and around the plots, such as field-installed irrigation systems, tall trees, power lines, fixed machinery and others; • Define a surveillance flight plan using a flight planning tool, the surveillance flight plan being defined with respect to each crop and each plot, with various guidelines per crop (e.g., potatoes or other flat crops are targeted on plots of 5-20 ha; each plot receives a single high-altitude photo, planned by the GPS coordinates of the center of the field and the magnetic bearing of the whole field, so as to capture the whole field straight in a single high-altitude shot; a low-altitude flight is also planned at this stage to define a flight path, planned as an X pattern with 10-20 meters of spacing between photos, for extremely high low-altitude resolution; a planning sketch under such assumptions is given below). These settings change by crop family or at the specific request of each customer. It is noted that optionally the same flight path is flown over each plot several times throughout the entire season.
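By way of illustration only, the following Python sketch generates photo points along the two diagonals of a rectangular plot with a fixed spacing, in the spirit of the X-pattern low-altitude path described above; the plot geometry, the spacing value and the output format are assumptions for the sketch.

import math

def x_pattern_waypoints(width_m, height_m, spacing_m=15.0):
    """Return photo points along the two diagonals of a width_m x height_m plot (local coordinates)."""
    points = []
    for (x0, y0, x1, y1) in ((0.0, 0.0, width_m, height_m), (0.0, height_m, width_m, 0.0)):
        length = math.hypot(x1 - x0, y1 - y0)
        n = max(1, int(length // spacing_m))      # number of spacing intervals along this diagonal
        for i in range(n + 1):
            t = i / n
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return points

# Example: photo points for a 300 m x 400 m plot with 15 m spacing.
waypoints = x_pattern_waypoints(300.0, 400.0)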
[00187] It is noted that the surveillance flight plan may be updated. For example, on the day of the actual flight (if the flight plan is defined in advance), the flight crew and/or a local contact can reach the agricultural area, check for obstacles to low flight, and check the wind in order to optimize the flight paths for headwind or tailwind (e.g., preferably taking pictures with a headwind rather than a crosswind).
[00188] Stage 810 of method 800 includes flying the aerial surveillance system, based on the surveillance flight plan, along a flight path over an agricultural area in which crops grow. Referring to the examples recorded with respect to the previous drawings, the aerial surveillance system can be the image forming sensor 210 or the entire aerial system 10, and the flight of stage 810 can be performed by the aerial platform 100. It is noted that all the optional variations, implementations and substages discussed with respect to stage 510 can be adapted to be relevant to stage 810, which is executed based on the surveillance flight plan.
[00189] Stage 820 of method 800 includes acquiring, by the in-flight aerial surveillance system and based on the acquisition site plan, image data of parts of the agricultural area at submillimeter image resolution. Referring to the examples recorded with respect to the previous drawings, the aerial surveillance system may be the image forming sensor 210 or the entire aerial system 10. It is noted that all variations, implementations and optional substages discussed in relation to stage 520 may be adapted to be relevant to stage 820, which is executed based on the surveillance flight plan.
[00190] Method 800 may include optional stage 830 (illustrated in Fig. 8), which includes processing the image data by an airborne processing unit, to provide image data content that includes high quality images of leaves of the crops. The airborne processing unit is carried by the same aerial platform that flies the aerial surveillance system over the agricultural area. Referring to the examples recorded with respect to the previous drawings, stage 830 can be executed by processor 220. It is noted that all optional variations, implementations and substages discussed in relation to stage 530 can be adapted to be relevant to stage 830. Stage 830 can be performed based on the surveillance flight plan defined in stage 805, but this is not necessarily so. For example, the processing of optional stage 830 can be based on information related to the type of crops or the types of diseases sought, which is included in the surveillance flight plan. It is noted that the surveillance flight plan (or a more general plan defined for the surveillance flight, a plan that includes the surveillance flight plan as well as additional information) may include parameters and/or instructions that affect the processing of optional stage 830 (e.g., instructions on how much information is to be transmitted to an external system in stage 840).
[00191] It is noted that method 800 may also include processing the image data to provide other decision-facilitating information, similar to the processing discussed with respect to stage 550 (e.g., with respect to stage 551). Like stage 830, such image data processing may be based on the surveillance flight plan, but this is not necessarily so.
[00192] Stage 840 of method 800 includes transmitting to an external system image data content that is based on the image data acquired by the aerial surveillance system. Referring to the examples recorded with respect to the previous drawings, the transmission of stage 840 can be performed by communication module 230. It is noted that all optional variations, implementations and substages discussed with respect to stage 540 can be adapted to be relevant to stage 840, which is executed based on the surveillance flight plan.
[00193] Method 800 may also include stages 850, 860, 870, 880 and 890, which correspond to stages 550, 560, 570, 580 and 590 respectively. Each of stages 850, 860, 870, 880, and 890 may include substages that correspond to the previously discussed substages of the corresponding stages 550, 560, 570, 580, and 590 of method 500. Each of stages 850, 860, 870, 880, and 890 (and its substages) may be based on the surveillance flight plan defined in stage 805, but this is not necessarily so.
[00194] Fig. 8 is a flow diagram illustrating an example of method 800 for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Method 800 may optionally include stage 801 (which precedes stage 805), which includes receiving surveillance requests associated with a plurality of independent entities. Stage 805 in such a case may include stage 806 of defining the surveillance flight plan to indicate imaging locations for crops of each of the plurality of independent entities. Such entities, as discussed above, can be associated with different agricultural areas (e.g., a field and an orchard), with agricultural areas of different customers (e.g., a field of one customer and another field owned by another customer), and so on.
[00195] As discussed with respect to method 500 above (e.g., with respect to fig. 4A), more than one type of crop can grow in the agricultural area. Stage 805 may include stage 807 of defining different acquisition parameters for imaging sites associated with different types of crops.
[00196] Such acquisition parameters may include operational parameters of the aerial platform (e.g., speed, altitude above ground level, stability, etc.) and/or parameters of the aerial surveillance system and especially of its sensor (e.g., exposure time, f-number, focal length, resolution, detector sensitivity, speed compensation, etc.).
[00197] Fig. 9 is a functional block diagram illustrating an example agricultural monitoring system 200, in accordance with examples of the subject matter presently disclosed. Some components of the agricultural monitoring system 200 (also referred to as system 200, for convenience) may have analogous structure, function, and/or role in system 10 (and vice versa), and therefore the same reference numerals were used for indicate such analogous components. It is noted that different components of system 200 can run different stages of methods 500, 600, 700, and 800 (eg, as indicated below), and that system 200 as a whole can run processes that include two or more stages of these methods.
[00198] Agricultural monitoring system 200 includes at least one image forming sensor 210, communication module 230 and connector 290, and may include additional components such as (though not limited to) those discussed below.
[00199] The image forming sensor 210 is configured and operable to acquire image data at submillimeter image resolution of parts of an agricultural area 900 in which crops grow, when the image forming sensor is airborne. Image forming sensor 210 is airborne in the sense that it is operable to acquire image data while being flown on an aircraft. It is, however, noted that the image forming sensor 210 can also be used to capture images when not being carried by an aircraft. Additionally, a standard image forming sensor (e.g., a standard digital camera such as a Canon EOS 60D or a Nikon D3200) can be used as the image forming sensor 210.
[00200] It is noted that although at least part of the image data acquired by the image forming sensor 210 is acquired at submillimeter resolution, the image forming sensor 210 may optionally also acquire image data of the agricultural area at lower resolutions (e.g., 2 mm or 1 cm GSD, etc.). Image forming sensor 210 can be configured to acquire the lower resolution image data, if implemented, using the same configuration as used for the submillimeter resolution acquisition (e.g., if the aerial platform carrying image forming sensor 210 flies at a higher altitude), or using another configuration. Such another configuration can be used, for example, to acquire orientation quality images (e.g., having 2 cm GSD), to which the high resolution image data can be registered.
[00201] As discussed above with respect to stage 520 of method 500, the image forming sensor 210 may be operable to acquire image data of parts of the agricultural area at an image resolution that is finer by at least an order of magnitude than an average leaf size of the imaged crop. That is, in at least part of the image data, a plurality of leaves of the crop are imaged at a resolution that allows resolving at least ten independently resolvable parts of each leaf. A different intensity level can be measured for each of these resolvable parts of the leaf. Optionally, image forming sensor 210 may be operable to acquire image data of parts of the agricultural area at an image resolution that is finer by at least two orders of magnitude than an average leaf size of the imaged crop (and optionally finer by at least three orders of magnitude).
[00202] Different kinds of image forming sensor 210 can be used as part of system 200. For example, image forming sensor 210 can be a semiconductor charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor or an N-type metal-oxide-semiconductor (NMOS) image sensor. It is noted that more than one image forming sensor 210 may be included in system 200. For example, system 200 may include a first aerial image forming sensor for low-altitude photography of the agricultural area, and a second image forming sensor 210 for high-altitude orientation photography of the agricultural area (and possibly its surroundings as well). Additionally, system 200 may include image forming sensors 210 of different types. For example, system 200 may include image forming sensors 210 that are sensitive to different parts of the electromagnetic spectrum.
[00203] In addition to any optical elements that may be incorporated in the image forming sensor 210, the system 200 may include additional optical elements (e.g., elements 211, 212 and 213 in Fig. 11A) to direct light arriving from the agricultural area onto a light-collecting surface of the image forming sensor 210 (e.g., optional lens 211). Such additional optical elements can manipulate the light they collect before directing it onto the image forming sensor 210. For example, additional optical elements can filter out parts of the electromagnetic spectrum, can filter and/or change the polarization of the collected light, and so on.
[00204] Optionally, system 200 can be used to image parts of the agricultural area in a low-altitude flight (e.g., lower than 10 m above the ground, e.g., lower than 20 m above the ground, e.g., lower than 30 m above the ground). Optionally, image forming sensor 210 can be configured and operable to acquire the image data at an altitude of less than 20 meters above the tops of the crops growing in the agricultural area.
[00205] The selection of the operating altitude for system 200 can depend on several factors. First, the altitude of the aerial system above the agricultural area affects the amount of light reaching the image forming sensor 210, and therefore the exposure time and aperture that can be used to collect light during the acquisition of the image data. Therefore, although low flight may limit the field of view of the image forming sensor, it allows the acquisition of image data using a short exposure time and a small aperture, thereby facilitating image data capture by system 200 when flying at considerable speeds.
[00206] Another consideration when determining the operational flight altitude is noise and its cancellation, especially when acquiring image data while flying at considerable speeds (e.g., above 10 m/s). As discussed with respect to motion compensation, one of the ways in which compensation for the motion of the aerial platform during acquisition can be achieved is by rotating the image forming sensor 210 with respect to the agricultural area, or by rotating an optical component (e.g., mirror prism 212 or rotating mirror 213) which directs light from the agricultural area onto the image forming sensor 210. In such cases, the rotation speed of the rotating optical elements must compensate for the angular velocity of the aerial system with respect to a fixed point in the agricultural area (e.g., the center of an acquired image data frame). Given a fixed linear velocity V of the aerial platform (assuming it flies horizontally above the ground), the angular velocity of the aerial platform with respect to the ground is inversely proportional to the altitude of the aerial platform above the ground.
[00207] However, the actual angular velocity of the aerial platform with respect to the agricultural area depends not only on its speed and flight altitude, but also on noise and motion (pitch, roll, yaw, vibrations, drift, etc.). The angular velocity therefore consists of a component resulting from the flight speed of the aerial platform and a component resulting from such noise. If V is the horizontal flight velocity of the aerial platform and R is its altitude above the ground, then the angular velocity is ω_real = ω_V + ω_noise = V/R + ω_noise. Therefore, flying at low altitude reduces the relative effect of the noise on the angular velocity, and improves image quality. It is noted that the angular velocity of the rotating optical component can also be determined based on information regarding the noise, such as information regarding the motion of the aerial platform collected by IMU 270.
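By way of illustration only, the following Python snippet puts numbers to the relation ω_real = V/R + ω_noise, showing both the mirror (or prism) rotation rate needed to hold the imaged point still and how low altitude shrinks the relative weight of attitude noise; all values are assumed.

V = 15.0            # horizontal flight speed, m/s (assumed)
omega_noise = 0.05  # residual attitude rate from vibrations, tilt and drift, rad/s (assumed)
for R in (10.0, 50.0):                     # altitude above ground, m
    omega_v = V / R                        # angular rate due to forward flight
    noise_share = omega_noise / (omega_v + omega_noise)
    print(R, omega_v, round(noise_share, 3))
    # At 10 m: about 1.5 rad/s with noise contributing ~3% of the total;
    # at 50 m: about 0.3 rad/s with noise contributing ~14%.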
[00208] The system 200 further includes the communication module 230, which is configured and operable to transmit to an external system image data content that is based on the image data acquired by the aerial image forming sensor. The external system is a system that is not part of system 200, optionally one that is not installed on the aircraft carrying system 200. It is noted that communication module 230 may be operable to transmit image data content directly to a system that is located away from the aerial platform carrying the system 200. Optionally, the communication module 230 may be operable to transmit image data content to such a remote system by communicating via an aerial platform communication module (or a system installed on the aerial platform). For example, if the aerial platform is equipped with radio connection and/or satellite communication channel with a ground unit, then the communication module 230 can transmit the image data content to the radio unit and/or the satellite communication unit, which in turn would transmit it to the ground unit. As discussed in connection with stage 540 of method 500, communication module 230 may be operable to wirelessly transmit image data content to the external system. As discussed in connection with stage 540 of method 500, communication module 230 may be operable to transmit image data content to the external system in real time (or near real time).
[00209] Different types of communication modules 230 can be used as part of system 200. For example, an internet communication module, a fiber optic communication module, or a satellite-based communication module can be used.
[00210] The communication module 230 may optionally be an aerial communication module in the sense that it is operable to transmit image data while being flown on an aircraft. It is, however, noted that the communication module can also transmit the image data content when the aircraft is back on the ground. System 200 may be connected to the aerial platform when the communication module transmits the image data content, but this is not necessarily so.
[00211] System 200 additionally includes connector 290, which is operable to connect image forming sensor 210 to an aerial platform. The connection of the image forming sensor 210 to the aerial platform is a mechanical connection (that is, these two objects remain spatially close to each other because of the connector), even if the connecting means are not mechanical (e.g., an electromagnetic or chemical connection).
[00212] Different types of connectors 290 can be used to connect the image forming sensor 210. For example, any of the following types of connectors (as well as any combination thereof) can be used for connector 290: glue; welding; one or more screws, mechanical latches, staples, clamps, rivets and/or clips; hook and loop fasteners; magnetic and/or electromagnetic fasteners, and so on.
[00213] It is noted that connector 290 can connect imager sensor 210 to the aerial platform directly (i.e., when the sensor touches the platform either directly or with the connector as the only separation) or indirectly (e.g., connecting a system housing 200 to the aerial platform, where the imager sensor 210 is connected to the housing).
[00214] It is noted that connector 290 may connect other components of system 200 to the aerial platform - either directly or indirectly. For example, one or more connectors 290 can be used to connect communication module 230, optional processor 220, and/or an optional (not denoted) frame of system 200 to the aerial platform. Each of these components can be connected via connector 290 to the aerial platform either directly or indirectly. It is noted that connector 290 can include several connecting pieces, which can be used to connect different parts of system 200 to the aerial platform. Referring to the example of figs. 1A-1C, connector 290 may include a weld that attaches communication module 230 to the rear of the aircraft, as well as four screws connecting image forming sensor 210 to the front of the aircraft.
[00215] It is noted that connector 290 may be operable to connect one or more components of system 200 to the aerial platform in a detachable manner (e.g., using screws, hook and loop fasteners, fittings, etc.). It is noted that connector 290 may be operable to connect one or more components of system 200 to the aerial platform in a non-detachable manner (e.g., using solder, glue, etc.; while such a connector may be detached using specialized means, it is not designed to be detached regularly, or more than once). Image forming sensor 210 can be connected to the aerial platform using a detachable and/or a non-detachable connector 290. Using a detachable connector 290 can be useful, for example, if system 200 is a portable unit that is connected to different aircraft based on need (e.g., connected to an agricultural spraying aircraft according to the spraying plan for the day).
[00216] Fig. 10 is a functional block diagram illustrating an example agricultural monitoring system 200, in accordance with examples of the subject matter presently disclosed.
[00217] Optionally, system 200 may include processor 220. Processor 220 is operable to receive the image data acquired by image forming sensor 210, to process the data, and to transfer information that is based on the processing of the image data (such information may include, for example, instructions, image data content, etc.). It is noted that optional processor 220 may base its processing on other sources of information in addition to the image data acquired by image forming sensor 210. Generally, processor 220 may be configured and operable to perform any combination of one or more of the processing, analysis and computation procedures discussed with respect to stages 530 and 550 of method 500.
[00218] Processor 220 includes hardware components, and may also include dedicated software and/or firmware. The hardware components of processor 220 can be specially designed to speed up the processing of image data. Alternatively (or in addition), general purpose processors can be used (e.g., a field-programmable gate array (FPGA), an AMD Opteron 16-core "Abu Dhabi" MCM processor, and so on).
[00219] For example, processor 220 may be configured and operable to process the image data content to detect leaf diseases and/or indications of the effect of parasites on leaves of one or more plants in the agricultural area. For example, processor 220 may be configured and operable to process the image data content to identify selected agronomically significant data, and to generate agronomic image data for transmission to a remote system based on the selected agronomically significant data.
[00220] Optionally, the image forming sensor 210 can be configured and operable to acquire the image data while flying at speeds exceeding 10 m/s. Optionally, image forming sensor 210 can be configured and operable to acquire the image data while flying at speeds that do not fall below 50% of the average speed of the aerial platform along the flight path or along an imaging flight leg 911, as discussed with respect to fig. 4B.
[00221] Acquiring images when the aerial transport platform is flying at a relatively high speed allows it to cover relatively large parts of the agricultural area. Covering large parts of the agricultural area can also be facilitated by sampling the agricultural area, acquiring image data of a representative sample of it (e.g., as discussed in relation to Fig. 4B). For example, system 200 may be operable to acquire agricultural area image data at a coverage rate below 500 square meters per hectare.
[00222] Figs. 11A, 11B, 11C and 11D are functional block diagrams illustrating examples of agricultural monitoring system 200 with motion compensation mechanisms, in accordance with examples of the subject matter presently disclosed.
[00223] In the example of fig. 11A, motion compensation is achieved by rotating a mirroring prism through which light is directed to the image forming sensor 210. In the example of fig. 11A, system 200 includes one or more mechanical connections 241 (a shaft, in the illustrated example) that connect at least one component of image forming sensor 210 (in this case - mirror prism 212) to a motor 240. Through mechanical connection 241, the motion of motor 240 mechanically moves the at least one component of image forming sensor 210 (in the illustrated example - it moves mirror prism 212) with respect to an aerial transport platform (not shown in Fig. 11A). The motion of the motor moves the respective component (or components) of the image forming sensor 210 concurrently with the acquisition of image data by the image forming sensor 210. It is noted that, merely as a matter of convenience, the mirror prism 212, lens 211 and mirror 213 are illustrated outside the housing of the image forming sensor 210, while in fact they belong to the image forming sensor 210. It is noted that the optical component may belong to the image forming sensor 210 even if it is not enclosed in the same housing that holds a light-sensitive surface of the image forming sensor 210.
[00224] It is noted that other components that deflect light onto a light-sensitive surface of the image forming sensor 210 can be used in place of a prism (e.g., a rotating mirror). In fig. 11B, the entire image forming sensor 210 is moved by the motor 240 with respect to the aerial transport platform concurrently with the acquisition of the image data.
[00225] The movement of the one or more components of the image forming sensor 210 with respect to the aerial platform can be used to compensate for the movement of the aerial image forming sensor with respect to the crops during acquisition. Therefore, the image forming sensor 210 may be operable within the system 200 to acquire image data of the agricultural area when the aerial transport platform flies at relatively high speeds (eg, above 10 m/s), and therefore produce a high coverage rate of the agricultural area.
[00226] The speed at which the mechanical connection 241 moves the respective components of the image forming sensor 210 can be selected such that a relative speed between the light-collecting surface of the image forming sensor 210 (denoted 214 in Fig. 11B) and the imaged part of the agricultural area (in this instance of image data acquisition) is zero, or close to zero, but this is not necessarily so.
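The following is a minimal sketch of that selection rule, under simplifying assumptions (flat ground, near-nadir line of sight, and the fact that a beam reflected off a mirror turns at twice the mirror's angular rate); it is not the patent's implementation:

```python
import math

# Illustrative sketch: choosing a mirror rotation rate so that the relative
# speed between the light-collecting surface and the imaged ground patch is
# approximately zero during exposure.  Flat ground and a near-nadir line of
# sight are assumed; the factor of 0.5 accounts for the reflected beam turning
# at twice the mirror's angular rate.

def line_of_sight_rate_rad_s(ground_speed_m_s, height_above_crops_m):
    """Angular rate at which the line of sight must sweep backwards so that it
    keeps pointing at the same ground patch while the platform moves forward."""
    return ground_speed_m_s / height_above_crops_m

def mirror_rate_rad_s(ground_speed_m_s, height_above_crops_m):
    """Mirror rotation rate that approximately nulls the apparent image motion."""
    return 0.5 * line_of_sight_rate_rad_s(ground_speed_m_s, height_above_crops_m)

if __name__ == "__main__":
    rate = mirror_rate_rad_s(ground_speed_m_s=15.0, height_above_crops_m=10.0)
    print(f"required mirror rate: {math.degrees(rate):.0f} deg/s")   # ~43 deg/s
```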
[00227] The image forming sensor 210 may include a focusing mechanism (not shown), to focus light arriving from part of the agricultural area onto a light-sensitive surface of the image forming sensor 210. The focusing mechanism may be necessary, for example, to allow acquisition of image data when flying at varying altitudes above the ground. The focusing mechanism can be operated automatically (by a focusing control processor, not illustrated). The focus control processor is configurable and operable to focus optical elements of the image forming sensor 210 while light from a first part of the agricultural area is projected onto a light-collecting surface of the image forming sensor 210, such that the image forming sensor 210 then acquires image data of a second part of the agricultural area that does not fully overlap the first part of the agricultural area. Referring to the example in fig. 11B, this can be used, for example, to focus the image when light arrives diagonally (with respect to the ground) at the image forming sensor 210, and to acquire the image data when light from the agricultural area reaches the image forming sensor 210 vertically.
[00228] Optionally, the motor 240 may be operable to mechanically rotate at least one optical component of the image forming sensor 210 with respect to the aerial platform (eg, via one or more mechanical connections 241), to compensate for movement of the image forming sensor 210 with respect to crops during acquisition. The image forming sensor 210 in such cases may be configured and operable to: (a) initiate a focusing process concurrently with rotation of the at least one optical component when an acquisition optical axis is at an angle greater than 20° from the vertical axis, and (b) acquire the image data using vertical imaging when the acquisition optical axis is at an angle less than 20° from the vertical axis.
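The per-frame gating described above can be summarized with the following minimal sketch; the `sensor` object and its methods are hypothetical names assumed only for illustration, not an interface disclosed by the patent:

```python
# Illustrative sketch: gate focusing versus acquisition on the angle between
# the acquisition optical axis and the vertical axis as the rotating optical
# component sweeps through each frame.

FOCUS_THRESHOLD_DEG = 20.0

def step(axis_angle_from_vertical_deg, sensor):
    """Called repeatedly while the optical component rotates.  `sensor` is a
    hypothetical object exposing run_focus_step() and acquire_frame()."""
    if axis_angle_from_vertical_deg > FOCUS_THRESHOLD_DEG:
        # Light arrives diagonally: use this part of the sweep to focus on the
        # upcoming, partially overlapping ground patch.
        sensor.run_focus_step()
    else:
        # Optical axis close to vertical: acquire the frame using vertical imaging.
        sensor.acquire_frame()
```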
[00229] In the example of fig. 11C, motion compensation is achieved using lighting. Optionally, system 200 includes lighting unit 250 (eg, a projector and/or flash unit) that is configured and operable to illuminate crops during image data acquisition by the aerial image forming sensor. For example, LED (light emitting diode) lighting can be used. Illumination can be used to compensate for movement of the aerial image forming sensor with respect to crops during acquisition. Various types of lighting can be used (eg, depending on the relative importance of energy consumption considerations relative to other system 200 design factors). It is noted that flash lighting can be used to reduce the time that the light-sensitive surface 214 of the image forming sensor 210 must be exposed to light from the agricultural area 900 to produce an image, which in turn reduces the motion blur of the resulting image data.
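The effect of shorter, flash-lit exposures on motion blur can be seen with a simple numeric sketch (the ground speed, GSD, and exposure times below are illustrative assumptions only):

```python
# Illustrative sketch: motion blur, expressed in pixels, scales with exposure
# time, so illumination that permits a shorter exposure directly reduces blur
# for a sensor moving over the crops.

def motion_blur_pixels(ground_speed_m_s, exposure_s, gsd_mm):
    blur_m = ground_speed_m_s * exposure_s        # ground distance travelled
    return blur_m / (gsd_mm / 1000.0)             # expressed in pixels

if __name__ == "__main__":
    # Assumed values: 12 m/s ground speed, 0.5 mm GSD.
    for exposure in (1 / 250, 1 / 2000, 1 / 10000):   # ambient vs. flash-lit
        print(f"{exposure * 1e6:6.0f} us exposure -> "
              f"{motion_blur_pixels(12.0, exposure, 0.5):6.1f} px blur")
```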
[00230] In the example of fig. 11D, the agricultural monitoring system 200 includes an altimeter 250. For example, the altimeter 250 may be a laser altimeter, whose laser beam passes through a corresponding window of the agricultural monitoring system 200 (denoted "altimeter window" 252). System 200 may additionally include an inertial measurement unit (IMU) 270, which measures and reports aircraft speed, orientation, and gravitational forces using a combination of one or more accelerometers, gyroscopes, and/or magnetometers. System 200 may also include a rotary encoder 260, which measures a rotation rate of the rotating mirror 213 (or of a rotating mirror prism 212, as discussed above).
[00231] Information from the IMU 270, altimeter 250 and rotary encoder 260 can be used by motor controller 248 to determine the rotation speed for motor 240 (and thus for the rotating mirror 213).
[00232] It is noted that the angular velocity at the imaging plane (eg, at the transparent window 219 that transfers light from the agricultural area 900 onto the image forming sensor 210) depends on several factors, which include the aircraft's airspeed, its pitch angle, and its height above the agricultural area 900. Additionally, information from a laser altimeter may also require correction based on pitch and roll angle data.
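As a minimal sketch of how a motor controller such as 248 might combine these inputs, the following assumes a pitch- and roll-corrected altimeter height, an along-track velocity component, and a deliberately simplistic proportional correction of the encoder error; the formulas and control law are illustrative assumptions, not the patent's controller:

```python
import math
from dataclasses import dataclass

# Illustrative sketch: combining IMU, laser-altimeter and rotary-encoder
# readings into a commanded mirror rotation rate (assumed formulas only).

@dataclass
class ImuReading:
    ground_speed_m_s: float
    pitch_deg: float
    roll_deg: float
    drift_angle_deg: float      # angle between flight direction and main axis

def height_above_ground_m(laser_range_m, pitch_deg, roll_deg):
    """Project the slant laser range onto the vertical using aircraft attitude."""
    return laser_range_m * math.cos(math.radians(pitch_deg)) \
                         * math.cos(math.radians(roll_deg))

def along_track_speed_m_s(imu: ImuReading):
    """Velocity component along the aircraft main axis; the perpendicular
    (drift) component would be compensated separately."""
    return imu.ground_speed_m_s * math.cos(math.radians(imu.drift_angle_deg))

def commanded_mirror_rate_rad_s(imu: ImuReading, laser_range_m,
                                encoder_rate_rad_s, gain=1.0):
    """Target rate that nulls image motion, plus a proportional correction of
    the error reported by the rotary encoder (simplistic control law)."""
    h = height_above_ground_m(laser_range_m, imu.pitch_deg, imu.roll_deg)
    target = 0.5 * along_track_speed_m_s(imu) / h
    return target + gain * (target - encoder_rate_rad_s)
```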
[00233] Optionally, the rotation axis of the rotating mirror 213 is parallel to the horizon and perpendicular to the main axis of the aircraft. However, since the aircraft's direction of flight is not necessarily parallel to the main axis of the aircraft (eg, it may drift because of crosswind, or for maneuvering reasons), system 200 can also compensate for the velocity component perpendicular to the main axis of the aircraft.
[00234] Fig. 12 is a functional block diagram illustrating an example agricultural monitoring system 200, in accordance with examples of the subject matter presently disclosed. As mentioned above, optionally the agricultural monitoring system 200 may include an airborne aerial platform 100 that is operable to fly the aerial image forming sensor along a flight path over an agricultural area.
[00235] Different types of aerial platforms can be used as aerial platform 100. For example, aerial platform 100 can be an aerial platform of any of the following types: an airplane, a helicopter, a multirotor helicopter (eg, a quadricopter), an unmanned aerial vehicle (UAV), a powered parachute (also referred to as a paraglider, PPC, or paraplane), and so on. The type of aerial platform 100 can be determined based on various considerations such as aerodynamic parameters (eg, speed, flight altitude, maneuverability, stability, carrying capacity, etc.), the degree of manual control or automation, required additional uses of the aerial platform, and so on.
[00236] Optionally, aerial platform 100 included in system 200 may include an engine, operable to propel aerial platform 100 during its flight. Optionally, aerial platform 100 included in system 200 may include wings (either fixed or rotating), operable to provide support for aerial platform 100 during its flight.
[00237] Fig. 15 is a flow diagram illustrating an example method 1100 for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Referring to the examples registered with respect to the previous drawings, method 1100 can be performed by server 300. Referring to method 500, it is noted that execution of method 1100 can start after stage 540 of transmitting the image data content is completed, but may also start during the execution of stage 540. That is, the server may begin to receive, process, and use some image data content before all image data content is generated by the air system. This may be the case, for example, if the air system processes and transmits image data content during the acquisition flight.
[00238] Method 1100 starts with stage 1110 of receiving image data content that is based on image data of an agricultural area, the image data being submillimeter-resolution image data acquired by an aerial image forming sensor at a set of imaging sites along a flight path extending over the agricultural area. Referring to the examples registered with respect to the previous drawings, the image data content received at stage 1110 may be part or all of the image data content transmitted at stage 540 of method 500, and/or part or all of the image data content transmitted by communication module 230 of system 200. Stage 1110 can be performed by communication module 310 of server 300.
[00239] Method 1100 continues with stage 1120 of processing the image data content to generate agronomic data that includes agronomic image data. Referring to the examples registered with respect to the following drawing, stage 1120 can be performed by the server processing module 320. It is noted that different types of image data content processing can be performed at stage 1120. Especially, any processing discussed with respect to stage 550 can be included in stage 1120.
[00240] Optionally, stage 1120 processing may include analyzing the image data content to identify selected agronomically significant data within the image data content; and processing the agronomically significant data to provide the agronomic data.
[00241] Optionally, stage 1120 processing may include applying computerized processing algorithms to image data content to detect leaf diseases or indication of leaf parasite effect on one or more plants in the agricultural area.
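A minimal sketch of one possible computerized processing step of the kind described above is given below; the color-ratio feature, tile size and threshold are illustrative assumptions and do not represent the patent's detection algorithm:

```python
import numpy as np

# Illustrative sketch: flag image tiles whose color statistics suggest
# chlorotic or necrotic leaf tissue, as one crude example of detecting leaf
# disease indications in submillimeter-resolution image data.

def suspicious_tiles(rgb_image, tile_px=256, green_ratio_threshold=0.95):
    """Return (row, col) tile indices where the mean green channel no longer
    dominates red - a rough proxy for yellowing or browning foliage."""
    h, w, _ = rgb_image.shape
    flagged = []
    for r in range(0, h - tile_px + 1, tile_px):
        for c in range(0, w - tile_px + 1, tile_px):
            tile = rgb_image[r:r + tile_px, c:c + tile_px].astype(np.float64)
            mean_r, mean_g = tile[..., 0].mean(), tile[..., 1].mean()
            if mean_g < green_ratio_threshold * mean_r:
                flagged.append((r // tile_px, c // tile_px))
    return flagged
```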
[00242] Stage 1130 of method 1100 includes transmitting the agronomic data to a remote end-user system. Referring to registered examples with respect to the following drawing, stage 1130 can be performed by communication module 310 of server 300.
[00243] Referring to the examples registered with respect to the previous drawings, the agronomic data transmitted in stage 1130 can be transmitted to various entities such as an agricultural airplane 991, an agronomist 992, and/or a farmer 993.
[00244] It is noted that method 1100 is executed by a server (such as server 300) that supports the various variations discussed with respect to method 500. For example, with respect to detecting crop growth in the agricultural area, the reception of stage 1110 may include receiving image data content of the agricultural area acquired (by at least one image forming sensor) on different days (which may span several weeks), and the processing of stage 1120 may include processing the image data content to determine growth parameters for plants in the agricultural area.
[00245] With regard to another example of monitoring agricultural areas of multiple entities, it is noted that, optionally, the image data content may include first image data content of a first farm of a first owner, and second image data content of a second farm of a second owner other than the first owner; and the transmitting of stage 1130 may include transmitting the first image data content in a first message, and transmitting the second image data content in a second message. Each of the first message and the second message can include information identifying the owner of the respective farm, and/or can be routed to a system and/or another entity associated with the respective owner.
[00246] Method 1100 may further include a stage of applying computerized processing algorithms to agronomic data to select, from a plurality of possible recipients, a recipient for the agronomic image data, based on the agronomic experience of the possible recipients. Transmission from stage 1130 can be performed based on the results of the selection.
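A minimal sketch of such a recipient-selection stage is shown below; the tag-based data model and scoring are illustrative assumptions rather than the patent's method:

```python
# Illustrative sketch: selecting, from a plurality of possible recipients, the
# one whose declared agronomic experience best matches the findings carried by
# the agronomic image data (assumed data model).

def select_recipient(findings, recipients):
    """`findings` is a set of topic tags (eg {"leaf_blight", "rice"});
    `recipients` maps a recipient id to its set of experience tags."""
    def score(experience):
        return len(findings & experience)
    best = max(recipients, key=lambda rid: score(recipients[rid]))
    return best if score(recipients[best]) > 0 else None

if __name__ == "__main__":
    recipients = {
        "agronomist_992": {"leaf_blight", "rice", "irrigation"},
        "farmer_993": {"irrigation"},
    }
    print(select_recipient({"leaf_blight", "rice"}, recipients))  # agronomist_992
```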
[00247] Fig. 16 is a functional block diagram illustrating an example server 300 used for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Server 300 may include communication module 310 and server processing module 320, as well as additional components omitted for reasons of simplicity (e.g., power supply, user interface, etc.).
[00248] As discussed in greater detail above, the content of image data received can be based on image data obtained in low flight over the agricultural area. Especially, the image data content can be based on image data obtained at a set of imaging sites along the flight path that are located less than 20 meters above the top of plantations growing in the agricultural area.
[00249] The systems and methods discussed above have been described in the context of monitoring an agricultural area in which crops grow. It will be clear to a person experienced in the art that these methods and systems can also be useful (eg, agronomically useful) for monitoring areas of soil that currently do not have any crops growing in them. For example, such systems and methods can also be used to determine the types of soils on these lands, their material composition, the level of irrigation in these areas, to identify parasites or weeds, and so on. It is therefore noted that the systems described above can be adapted to monitor soil areas with or without plantations, MUTATIS MUTANDIS. In both cases, ground area imaging is still done in submillimeter resolution, and can be implemented in any of the modes discussed above (eg, using motion compensation, etc.). A few examples are provided with respect to figs. 17, 18 and 19.
[00250] Fig. 17 is a flow diagram illustrating an example method 1800 for monitoring a ground area, in accordance with examples of the subject matter presently disclosed. Referring to the examples registered with respect to the previous drawing, method 1800 can be performed by system 10.
[00251] Method 1800 includes a stage of defining a surveillance flight plan (stage 1805, which is discussed below), which is followed by acquiring and using image data of a ground area based on the surveillance flight plan. The stages of method 1800 that follow stage 1805 may be variations of the corresponding stages of method 500 (corresponding stages of these two methods are numbered with corresponding reference numerals, i.e. stage 1810 corresponds to stage 510, stage 1820 corresponds to stage 520, and so on) - with the modification that the land area is not necessarily an agricultural area in which crops grow. For example - it can be an agricultural area before (or after) crops are grown in it (eg, after sowing), an area of soil adjacent to an agricultural area (and which can affect the agricultural area, eg, because of dust or parasites), or another type of ground area.
[00252] It is noted that variations and examples discussed with reference to method 500 are also relevant to method 1800, where applicable, MUTATIS MUTANDIS. Where applicable, the relevant variations of stages 510, 520 and possibly also 530, 540 and the following stages can be implemented in the corresponding stages of method 1800 (ie 1810, 1820, and so on), as performed based on the surveillance flight plan defined at stage 1805 - with the modification that the land area is not necessarily an agricultural area in which crops grow.
[00253] Stage 1805 of method 1800 includes defining a surveillance flight plan for an air surveillance system, the surveillance flight plan including acquisition site plan indicative of a plurality of imaging sites.
[00254] Referring to the examples recorded with respect to the previous drawings, stage 1805 can be performed by different entities, such as air system 10, server 300, and an end-user device (eg, of agronomist 992, of farmer 993, of an unillustrated planning center, and so on), or any combination thereof (eg, a plan may be suggested by agronomist 992, and then revised by air system 10 based on weather conditions).
[00255] The definition of stage 1805 can be based on several considerations. For example, the surveillance flight path and possibly additional parameters can be defined in order to allow image acquisition in the required qualities. Stage 1805 can include, for example, the following substages:
• Based on information obtained from the customer, define the desired one or more land areas;
• Receive geographic information system (GIS) information of the one or more land areas, as well as information regarding the structure of the one or more land areas (such as GIS information regarding irrigation pipes, roads, or other aspects of the structure);
• Optionally receive information regarding the soil in the land area, such as soil type, variety, etc.;
• Based on the GIS information (possibly using additional information as well), define the topography and obstacles in each of the one or more ground areas and around the one or more ground areas, such as field-installed irrigation systems, tall trees, power lines, fixed machinery and others;
• Define a surveillance flight path plan using a flight planning tool, the surveillance flight plan being defined with respect to each of the one or more ground areas (or subdivisions thereof). It is noted that, optionally, general guidelines can be included for different types of soils or for other distinct sub-areas in the one or more soil areas.
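To make the outcome of these substages concrete, the following is a minimal sketch of how such an acquisition-site plan might be represented; the field names are illustrative assumptions, not a data format disclosed by the patent:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative sketch: a minimal representation of a surveillance flight plan
# produced by the substages above (assumed fields only).

@dataclass
class ImagingSite:
    lat: float
    lon: float
    altitude_above_ground_m: float       # kept low for submillimeter GSD
    heading_deg: float
    soil_or_crop_type: Optional[str] = None

@dataclass
class Obstacle:
    lat: float
    lon: float
    height_m: float
    kind: str                             # eg "irrigation pivot", "power line"

@dataclass
class SurveillanceFlightPlan:
    area_id: str
    imaging_sites: List[ImagingSite] = field(default_factory=list)
    obstacles: List[Obstacle] = field(default_factory=list)
    notes: str = ""                       # general guidelines per sub-area
```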
[00256] It is noted that the surveillance flight plan may be updated. For example, on the day of the actual flight (if the surveillance flight plan is defined in advance), the flight crew and/or a local contact can reach the ground area, check obstacles for low flight, and check the wind in order to optimize the flight paths by flying into a headwind or tailwind (eg, preferably taking pictures with headwind rather than crosswind).
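One possible, deliberately simple way to express that day-of-flight update is sketched below; the candidate headings and wind model are illustrative assumptions only:

```python
import math

# Illustrative sketch: pick the flight-leg heading with the smallest crosswind
# component, so imaging legs are flown into a headwind or tailwind rather than
# a crosswind.

def crosswind_component(wind_speed_m_s, wind_from_deg, leg_heading_deg):
    angle = math.radians(wind_from_deg - leg_heading_deg)
    return abs(wind_speed_m_s * math.sin(angle))

def best_leg_heading(candidate_headings_deg, wind_speed_m_s, wind_from_deg):
    return min(candidate_headings_deg,
               key=lambda h: crosswind_component(wind_speed_m_s, wind_from_deg, h))

if __name__ == "__main__":
    # Wind of 6 m/s blowing from 270 deg; legs may run north-south or east-west.
    print(best_leg_heading([0, 90, 180, 270], 6.0, 270.0))   # prints 90
```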
[00257] Stage 1810 of method 1800 includes flying the air surveillance system, based on the surveillance flight plan, along a flight path over a ground area (the term "ground area" is explained in the previous paragraphs). Referring to the examples recorded with respect to the previous drawings, the aerial surveillance system can be the image forming sensor 210 or the entire aerial system 10 (with the modification that the land area is not necessarily an agricultural area in which crops grow), and the flight of stage 1810 can be performed by aerial platform 100. It is noted that all optional variations, implementations and substages discussed with respect to stage 510 can be adapted to be relevant to stage 1810, which is performed based on the surveillance flight plan.
[00258] Stage 1820 of method 1800 includes acquiring by the in-flight air surveillance system, based on the acquisition site plan, image data of parts of the ground area in submillimeter image resolution. Referring to the examples recorded with respect to the previous drawings, the aerial surveillance system can be the image forming sensor 210 or the entire aerial system 10 (with the modification that the land area is not necessarily an agricultural area in which crops grow ). It is noted that all optional variations, implementations and substages discussed in relation to stage 520 can be adapted to be relevant to stage 1820, which is executed based on the surveillance flight plan.
[00259] Method 1800 may include optional stage 1830, which includes processing the image data by an aerial processing unit, to provide image data content that includes high quality images of the ground and/or of objects deposited on the ground (or partially exposed on the ground). For example, a clod of earth, a small piece of ground (eg, 2 cm x 2 cm), organic layers or residues (soil horizon O, including L, F, and/or H layers), topsoil (soil horizon A), a rock, a stone, a pipe, a sprayer, live animals (eg, insects, caterpillars, parasites, etc.), and so on.
[00260] The stage 1830 aerial processing unit is carried by the same aerial platform that flies the aerial surveillance system over the ground area. Referring to the examples recorded with respect to the previous drawings, stage 1830 can be performed by a system processor of stage 1810 (eg, processor 220, MUTATIS MUTANDIS). It is noted that all the optional variations, implementations and substages discussed in relation to stage 530 can be adapted to be relevant to stage 1830.
[00261] Stage 1830 can be performed based on the surveillance flight plan defined in stage 1805, but this is not necessarily so. For example, the optional stage 1830 processing can be based on information related to the type of soil or types of agricultural conditions sought (eg, soil moisture, soil uniformity, and so on), which is included in the surveillance flight plan. It is noted that the surveillance flight plan (or a more general plan defined for the surveillance flight, a plan that includes the surveillance flight plan as well as additional information) may include parameters and/or instructions that affect the processing of stage 1830 (eg, instructions as to how much information is to be transmitted to an external system at stage 1840).
[00262] It is noted that method 1800 may also include processing the image data to provide other decision-making information, similar to the processing discussed in relation to stage 550 (eg, in relation to stage 551), MUTATIS MUTANDIS . Like stage 1830, such processing of image data may be based on the surveillance flight plan, but this is not necessarily so.
[00263] Stage 1840 of method 1800 includes transmitting to an external system image data content that is based on the image data acquired by the air surveillance system. Referring to the examples registered with respect to the previous drawings, the transmission of stage 1840 can be carried out by the communication module 230, MUTATIS MUTANDIS. It is noted that all the optional variations, implementations and substages discussed in relation to stage 540 can be adapted to be relevant to stage 1840, MUTATIS MUTANDIS, which is executed based on the surveillance flight plan.
[00264] Method 1800 can also include stages 1850, 1860, 1870, 1880 and 1890, which correspond to stages 550, 560, 570, 580 and 590 respectively (with the modification that the land area is not necessarily an agricultural area in which crops grow). Each of stages 1850, 1860, 1870, 1880, and 1890 may include substages that correspond to the substages discussed above of the corresponding stages 550, 560, 570, 580, and 590 of method 500 (with the modification that the ground area is not necessarily a agricultural area in which crops grow). Each of stages 1850, 1860, 1870, 1880 and 1890 (and their substages) may be based on the surveillance flight plan defined in stage 1805, but this is not necessarily so.
[00265] Referring to method 1800 as a whole, method 1800 (and particularly also the design of the surveillance flight plan) can be used, for example, to see if a sown agricultural area has already germinated, if an area of soil is suitable for agricultural use, to determine if pipelines and/or irrigation systems and/or other agricultural systems are working, and so on.
[00266] For example, the ground area can include different types of ground, and acquisition can include acquiring image data from different locations in the ground area, to generate a ground map of the ground area (eg, either on the aerial platform and/or in a ground system).
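As a minimal sketch of how such a ground map could be aggregated from per-location classifications (the grid resolution, labels, and input format are illustrative assumptions, not the patent's processing):

```python
from collections import Counter, defaultdict

# Illustrative sketch: build a coarse soil map of the land area from soil-type
# classifications made at individual imaging locations.

def soil_map(samples, cell_size_deg=0.001):
    """`samples` is an iterable of (lat, lon, soil_type) tuples; returns a dict
    mapping a grid cell to the most frequently observed soil type in it."""
    cells = defaultdict(Counter)
    for lat, lon, soil_type in samples:
        cell = (round(lat / cell_size_deg), round(lon / cell_size_deg))
        cells[cell][soil_type] += 1
    return {cell: counts.most_common(1)[0][0] for cell, counts in cells.items()}

if __name__ == "__main__":
    samples = [(32.1001, 34.9001, "clay"), (32.1002, 34.9002, "clay"),
               (32.1050, 34.9050, "sandy loam")]
    print(soil_map(samples))
```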
[00267] For example, acquisition may include acquiring image data that is indicative of material composition from different locations in the ground area. Such material composition can include different types of soil and/or stones, different types of minerals, and so on.
[00268] For example, the acquisition may include image data that is indicative of the level of agricultural preparedness of different locations in the soil area.
[00269] It is noted that more than one type of soil (or other objects on the terrain, over it, or partially exposed in it) may be present in the soil area. Stage 1805 may include defining different acquisition parameters for imaging sites associated with different soil types (or with other objects such as those mentioned earlier in this paragraph).
[00270] Such acquisition parameters may include operational parameters of the aerial platform (eg, speed, altitude above ground level, stability, etc.) and/or parameters of the air surveillance system and especially of its sensor (eg, exposure time, f-number, focal length, resolution, detector sensitivity, speed compensation, etc.).
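A minimal sketch of attaching such per-soil-type acquisition parameters to imaging sites is given below; the parameter names, soil-type labels, and values are illustrative assumptions only:

```python
# Illustrative sketch: per-soil-type acquisition parameters that a stage-1805
# plan could associate with imaging sites (assumed names and values).

ACQUISITION_PARAMS = {
    "dark clay":    {"exposure_s": 1 / 1000, "f_number": 4.0,
                     "altitude_m": 8.0,  "speed_compensation": True},
    "bright sand":  {"exposure_s": 1 / 4000, "f_number": 8.0,
                     "altitude_m": 12.0, "speed_compensation": True},
    "stony ground": {"exposure_s": 1 / 2000, "f_number": 5.6,
                     "altitude_m": 10.0, "speed_compensation": False},
}

def params_for_site(soil_type, default="stony ground"):
    """Return the acquisition parameters for a site's soil type, falling back
    to a default set when that type was not planned for."""
    return ACQUISITION_PARAMS.get(soil_type, ACQUISITION_PARAMS[default])
```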
[00271] Fig. 18 is a flow diagram illustrating an example method 1900 for agricultural monitoring, in accordance with examples of the subject matter presently disclosed. Referring to the examples registered with respect to the following drawings, method 1900 can be executed by server 1300.
[00272] Referring to method 1800, it is noted that the execution of method 1900 may start after stage 1840 of transmitting the image data content is completed, but may also start during the execution of stage 1840. That is, the server can start receiving, processing and using some image data content before all image data content is generated by the air system. This may be the case, for example, if the air system processes and transmits image data content during the acquisition flight.
[00273] Method 1900 starts with stage 1910 of receiving image data content that is based on image data of a ground area, the image data being submillimeter-resolution image data acquired by an aerial image forming sensor at a set of imaging sites along a flight path extending over the ground area. Referring to the examples recorded with respect to the previous drawings, the image data content received at stage 1910 may be part or all of the image data content transmitted at stage 1840 of method 1800, and/or part or all of the image data content transmitted by the communication module 230 of the system 200 (MUTATIS MUTANDIS). Stage 1910 can be performed by communication module 1310 of server 1300.
[00274] Method 1900 continues with stage 1920 for processing image data content to generate terrestrial data that includes terrestrial image data. Referring to the registered examples with respect to the following drawing, stage 1920 can be executed by the server processing module 320 (MUTATIS MUTANDIS). It is noted that different types of image data content processing can be performed at stage 1920. Especially, any processing techniques discussed in relation to stage 550 can be included at stage 1920.
[00275] The term “terrestrial data” is pertinent to data that relate to land and/or soil. In some implementations of the invention, the term “terrestrial data” can be interpreted broadly to also include objects that touch the ground, whether living objects (eg, caterpillars, fallen leaves) or inanimate objects (eg, pipes, sprayers). However, some implementations of method 1900 (and server 1300) are implemented in a stricter sense, in which the term “terrestrial data” is pertinent only to the ground itself (top of the ground, stones, etc.).
[00276] Optionally, stage 1920 processing may include analyzing image data content to identify significant agronomic data to provide terrestrial data. For example, such selected agronomic data can be selected images that clearly show the type of soil, images in which parasites, caterpillars, or other living creatures are shown, images in which pipe breakage or wear is shown, and so on.
[00277] Optionally, stage 1920 processing may include analyzing the image data content to identify selected significant terrestrial data within the image data content; and processing the significant terrestrial data to provide the terrestrial data. For example, significant terrestrial data may include images in which the soil type is shown, images that are indicative of the content of lower soil layers (lower than the top of the soil) that may be exposed in some areas, and so on.
[00278] Optionally, stage 1920 processing may include applying computerized processing algorithms to the image data content to differentiate between areas with different soil types in the terrain area. Such different soil types may correspond to different types of earth, rocks, stones and/or other minerals.
[00279] Optionally, stage 1920 processing may include determining a composition of materials in the ground area, and generating the terrestrial data in response to a determination result.
[00280] Stage 1930 of method 1900 includes transmitting terrestrial data to a remote end-user system. Referring to the examples registered with respect to the following drawing, stage 1930 can be performed by the communication module 1310 of the server 1300.
[00281] Referring to the examples recorded with respect to the previous drawings, the terrestrial data transmitted at stage 1930 can be transmitted to various entities such as agricultural airplane 991, agronomist 992, soil scientist, geologist, and/or farmer 993.
[00282] It is noted that method 1900 can be executed by a server (such as server 1300) that supports the various variations discussed with respect to method 1800, MUTATIS MUTANDIS.
[00283] Method 1900 may further include a stage of applying computerized processing algorithms to terrestrial data to select, from a plurality of possible recipients, a recipient for the terrestrial image data, based on the terrestrial experience of the possible recipients. Stage 1930 broadcast can be performed based on selection results.
[00284] Referring to method 1900 generally, it is noted that image data content may be based on image data acquired at a set of imaging locations along the flight path that are located less than 20 meters above the ground area.
[00285] Fig. 19 is a functional block diagram illustrating an example server 1300 used for monitoring a ground area, in accordance with examples of the subject matter presently disclosed. Server 1300 may include communication module 1310 and server processing module 1320, as well as additional components omitted for reasons of simplicity (e.g., power supply, user interface, etc.).
[00286] As discussed in more detail above, the received image data content can be based on image data obtained in low flight over the ground area. Especially, the image data content can be based on image data acquired at a set of imaging locations along the flight path that are located less than 20 meters above the ground area of the terrain.
[00287] Although certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes and equivalents will now occur to those of ordinary skill in the art. It is to be understood, therefore, that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
[00288] It will be appreciated that the configurations described above are cited by way of example, and various features thereof and combinations of these features can be varied and modified.
[00289] Although various configurations have been shown and described, it will be understood that there is no intention to limit the invention by such disclosure; rather, the intention is to cover all modifications and alternative constructions falling within the scope of the invention as defined in the appended claims.
Claims (15)
[0001]
1. Method for agricultural monitoring, comprising: - flying an aerial image forming sensor along a flight path over an agricultural area in which a crop grows; - acquiring by the aerial image forming sensor, configured with a ground sampling distance (GSD) less than 0.75 mm, image data of parts of the agricultural area, the image data acquisition being performed at a set of imaging sites along the flight path, including a plurality of low-altitude imaging sites that allow acquisition of the image data at submillimeter image resolution; and wherein the acquisition comprises acquiring image data at a set of imaging sites while flying the aerial image forming sensor over the imaging sites at speeds that do not fall below 50% of the average platform speed along the flight path, the method characterized in that the acquisition further comprises: mechanically rotating at least one optical component of the aerial image forming sensor with respect to an aerial transport platform to compensate for the movement of the aerial image forming sensor with respect to plantations during acquisition; and, concurrently with the rotation of the at least one optical component, for each frame of a plurality of frames of image data, initiating a focusing process of the image forming sensor when an acquisition optical axis is at an angle greater than 20° from the vertical axis, and acquiring the image data using vertical imaging when the acquisition optical axis is at an angle less than 20° from the vertical axis; - transmitting to an external system image data content that is based on the image data acquired by the aerial image forming sensor; and - applying computerized processing algorithms to the image data content to detect leaf diseases or an indication of parasite effect on leaves, on one or more plants in the agricultural area, or applying computerized processing algorithms to the image data to identify selected agronomically significant data and generating agronomic image data for transmission to a remote system based on the selected agronomically significant data, or applying computerized processing algorithms to the selected agronomically significant data to select, from a plurality of possible recipients, a recipient for the agronomic image data, based on the agronomic experience of the possible recipients.
[0002]
2. Method according to claim 1, characterized in that it comprises transmitting the image data content to the external system to display to an agronomist at a remote location agronomic image data that is based on the image data content , thus allowing the agronomist to remotely analyze the agricultural area.
[0003]
3. Method according to claim 1, characterized in that the flight path is a terrain-following flight path.
[0004]
4. Method according to claim 1, characterized in that the acquisition comprises mechanically moving at least one component of the aerial image forming sensor with respect to an aerial transport platform, to compensate for the movement of the aerial image forming sensor with respect to plantations during acquisition.
[0005]
5. Method according to claim 1, characterized in that the acquisition comprises illuminating the plantations during the acquisition, to compensate for the movement of the aerial image forming sensor with respect to the plantations during the acquisition.
[0006]
6. Method according to claim 1, characterized in that flying comprises flying the aerial image forming sensor along a flight path that extends over at least a first farm of a first owner and a second farm of a second owner different from the first owner, the method comprising: - acquiring first image data of parts of the first farm and acquiring second image data of parts of the second farm; - generating first image data content based on the first image data and generating second image data content based on the second image data; - providing the first image data content to a first entity in a first message, and providing the second image data content to a second entity in a second message.
[0007]
7. Method according to claim 1, characterized in that flying comprises flying the aerial imager sensor by an agricultural aircraft that is configured for aerial application of crop protection products, preferably further comprising selecting aerial application parameters for aerial application of crop protection products by agricultural aircraft based on image data processing.
[0008]
8. Method according to claim 1, characterized in that the set of imaging sites along the flight path are located less than 20 meters above the top of plantations growing in the agricultural area.
[0009]
9. Method according to claim 1, characterized in that the transmission is followed by a subsequent instance of flight, acquisition and transmission, the method further comprising planning a trajectory for the subsequent instance of flight based on the data acquired in a previous instance of acquisition.
[0010]
10. Method according to claim 1, characterized in that the flight is preceded by defining a surveillance flight plan for an air surveillance system, the surveillance flight plan comprising acquisition location plan indicative of a plurality of imaging sites, where the aerial sensor flight is part of flying the aerial surveillance system along a trajectory over an agricultural area, based on the surveillance flight plan.
[0011]
11. Method according to claim 1, characterized in that the flight is preceded by defining a surveillance flight plan for an air surveillance system, the surveillance flight plan comprising an acquisition location plan indicative of a plurality of imaging sites; - wherein the aerial sensor flight is part of flying, based on the surveillance flight plan, the aerial surveillance system along a trajectory over a land area; - acquiring, based on the acquisition location plan, image data during the flight by the aerial surveillance system of parts of the land area in submillimeter image resolution; and - transmitting to an external system image data content that is based on the image data acquired by the air surveillance system.
[0012]
12. Method according to claim 11, characterized in that the land area comprises different types of soil, and the acquisition comprises acquiring image data from different locations in the land area, to generate a ground map of the land area.
[0013]
13. Method according to claim 11, characterized in that the acquisition comprises acquiring image data that is indicative of the material composition of different locations in the land area, or wherein the acquisition comprises acquiring image data that is indicative of the level of agricultural preparedness of different locations in the land area.
[0014]
14. Agricultural monitoring system, characterized in that it comprises: - an imaging sensor, configured and operable to acquire image data in submillimeter image resolution of parts of an agricultural area in which crops grow, when the imaging sensor is flying; - a communication module, configured and operable to transmit to an external system image data content that is based on the image data acquired by the aerial image forming sensor; and - an operable connector for connecting the imaging sensor and the communication module to an aerial platform.
[0015]
15. Agricultural monitoring system according to claim 14, characterized in that it comprises at least one mechanical coupling that couples at least one component of the imaging sensor to a motor, whereby the movement of the motor mechanically moves the at least one imaging sensor component with respect to the aerial platform concurrently with the acquisition of image data by the imaging sensor, preferably comprising a motor operable to mechanically rotate at least one optical component of the imaging sensor with respect to the aerial platform, to compensate for the movement of the imaging sensor with respect to the crops during acquisition; - whereby the imaging sensor is configured and operable to: (a) initiate a focusing process concurrently with the rotation of the at least one optical component when an acquisition optical axis is at an angle greater than 20° from the vertical axis, and (b) acquire the image data using vertical imaging when the acquisition optical axis is at an angle less than 20° from the vertical axis.